How To Do Dynamic Web Rendering?

Alexander aldem+dmars at nk7.net
Mon May 16 08:11:24 PDT 2011


On 15.05.2011 20:54, Adam D. Ruppe wrote:

> FYI, PHP uses files on the hard drive for sessions by default...
> optionally, it can use a database too.

  Not really. There are different options for keeping session data, though - some of them do resort to storing something on disk. Either way (disk or DB), it slows everything down terribly.

  I don't know how many visitors your websites get, but once you have several visits per second, you will feel it.

> AFAIK, there is no in-memory option beyond the options the kernel or database provide for file/table caching.

  There are several, again - memcache, for example. Believe me, once you have at least 500k page requests/month (excluding images and other static content, of course), you will change your mind about where (and how) to store that data.
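
  For reference, the idea is nothing exotic - an expiring key/value map kept in RAM, fronted by something like memcached once you have more than one worker process. A minimal single-process sketch in D (hypothetical types, not any particular library):

    import core.time : minutes;
    import std.datetime : Clock, SysTime;

    // Expiring in-memory session store. Single-process only; with
    // several workers you would put memcached or similar behind the
    // same interface.
    struct SessionStore
    {
        private struct Entry { string[string] data; SysTime expires; }
        private Entry[string] entries;

        void put(string id, string[string] data)
        {
            entries[id] = Entry(data, Clock.currTime + 30.minutes);
        }

        // Returns null for unknown or expired sessions.
        string[string]* get(string id)
        {
            if (auto e = id in entries)
            {
                if (Clock.currTime < e.expires)
                    return &e.data;
                entries.remove(id); // lazy eviction of expired sessions
            }
            return null;
        }
    }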

> But, I never finished it, because I use sessions so rarely. Most
> usages of it don't match well with the web's stateless design -
> if someone opens a link on your site in a new window, can he
> browse both windows independently?

  The web is stateless only as long as you serve static content. What about e-commerce, shopping, applications like Google AdWords, web-mail, etc.? How can you handle those without some session state?

  But what do you mean by "independently"? Sure, you can browse both; it's just that any change in session state (like adding something to the shopping cart) will eventually be propagated to all windows.

> Changing one *usually* shouldn't change the other.

  Sorry? Have you ever shopped online? ;) If I have many windows open, with different items, I *expect* all of them to end up in *one* shopping cart - "usually" :)

> Another big exception is something like a shopping cart. In cases like that, I prefer to use the database to store the cart anyway.

  Ah, here we are. Sure, you will need to write to the DB - but only *if* something changes. For data that changes rarely but is requested quite often, every round trip to the database slows things down.
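
  The pattern is simple: read from a cache, fall back to the database on a miss, and write to the database only when a value actually changes. A rough D sketch (loadFromDb/saveToDb are hypothetical stand-ins for real queries):

    string[string] cache;

    string fetch(string key)
    {
        if (auto p = key in cache)
            return *p;            // fast path: no database round trip
        auto v = loadFromDb(key); // slow path: one query, then cached
        cache[key] = v;
        return v;
    }

    void update(string key, string value)
    {
        if (auto p = key in cache)
            if (*p == value)
                return;           // unchanged: skip the database write
        cache[key] = value;
        saveToDb(key, value);     // persist only real changes
    }

    // Stand-ins so the sketch compiles on its own.
    string loadFromDb(string key) { return "value-for-" ~ key; }
    void saveToDb(string key, string value) { /* INSERT/UPDATE here */ }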

> In web.d, the session functions automatically check IP address and user agent
> as well as cookies. It can still be hijacked in some places, but it's a little harder.

  This is exactly what well-designed applications and libraries do.
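
  The common way to do it is to bind the token to the client's IP address and user agent, so a stolen cookie alone is not enough. A minimal sketch of that check in D (not web.d's actual implementation):

    import std.digest : toHexString;
    import std.digest.sha : sha256Of;

    // Fingerprint a session: the same cookie presented from a different
    // IP or browser hashes to a different value and is rejected.
    string sessionFingerprint(string sessionId, string remoteAddr, string userAgent)
    {
        auto hash = sha256Of(sessionId ~ "|" ~ remoteAddr ~ "|" ~ userAgent);
        auto hex = toHexString(hash);
        return hex.idup;
    }

    bool validate(string stored, string sessionId, string remoteAddr, string userAgent)
    {
        return stored == sessionFingerprint(sessionId, remoteAddr, userAgent);
    }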

> To prevent hijacking in all situations, https is a required part of the solution, and the cgi library can't
> force that unilaterally. Well, maybe it could, but it'd suck.

  It *can* enforce it - by refusing unencrypted connections, or by redirecting to https when the request arrives over plain http.
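
  Something along these lines, with hypothetical Request/Response types for illustration (web.d's real API differs):

    import std.stdio : writeln;

    struct Request { bool isHttps; string host; string uri; }
    struct Response { int status = 200; string[string] headers; }

    // Refuse plain http by redirecting to the https equivalent.
    bool enforceHttps(const Request req, ref Response res)
    {
        if (!req.isHttps)
        {
            res.status = 301; // permanent redirect keeps clients on https
            res.headers["Location"] = "https://" ~ req.host ~ req.uri;
            return false;     // stop further processing
        }
        return true;
    }

    void main()
    {
        auto req = Request(false, "example.com", "/cart");
        Response res;
        if (!enforceHttps(req, res))
            writeln(res.status, " -> ", res.headers["Location"]);
    }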

> The downside is I *believe* it doesn't scale to massiveness. Then
> again, most of our sites aren't massive anyway, so does it matter?

  Most of ours are massive enough, so - yes, it does matter :)

> Finally, long running processes can't be updated. You have to kill them and restart, but if their state is only in memory, this means
> you lose data.

  Not really - the process should flush any dirty data to persistent storage and quit, so the new copy can pick up where it left off.
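
  Roughly: catch the termination signal, finish what you are doing, persist, exit - and let the supervisor start the new binary. A POSIX-flavoured D sketch (the state and file name are made up for illustration):

    import core.stdc.signal : SIGTERM, signal;
    import core.thread : Thread;
    import core.time : msecs;
    import std.file : write;
    import std.stdio : writeln;

    __gshared bool shuttingDown = false;

    // Keep signal handlers tiny: raise a flag, let the main loop do
    // the real work. (A production version would use an atomic flag.)
    extern (C) void onTerm(int) nothrow @nogc
    {
        shuttingDown = true;
    }

    void main()
    {
        signal(SIGTERM, &onTerm);
        string dirtyState = "cart=widget,qty=2"; // stand-in for in-memory data

        while (!shuttingDown)
        {
            // ... serve requests, mutate dirtyState ...
            Thread.sleep(100.msecs);
        }

        // Flush everything before exiting so the restarted copy can
        // pick up where this one left off.
        write("sessions.dat", dirtyState);
        writeln("state flushed, exiting cleanly");
    }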

/Alexander

