How To Dynamic Web Rendering?

Adam D. Ruppe destructionator at gmail.com
Mon May 16 09:25:10 PDT 2011


Alexander wrote:
> I don't know how many visitors your websites have, but if you have
> several visits per second - you will feel it.

Two notes here: #1, several visits per second means over 5 million
views a month (even 2/second * 86,400 seconds/day * 30 days is about
5.2 million). That's actually very rare.

#2: The way I do optimizations is to write the code however it first
comes to mind (which usually isn't half bad, if I do say so myself),
then watch it under testing.

If it proves to be a problem in real world use, then I'll start
watching the times. IE9 has developer tools built in to watch the
client side. D has profiling built in to watch the server side code
(dmd's -profile switch). I sometimes just sprinkle
writefln(timestamp) calls around too for quick stuff.
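
For the quick-and-dirty version, a stopwatch around the suspect call
is enough. Here's a minimal sketch using std.datetime.stopwatch
(expensiveWork is a hypothetical stand-in for whatever you're
measuring):

    import std.datetime.stopwatch : AutoStart, StopWatch;
    import std.stdio : writefln;

    void expensiveWork() { /* hypothetical code under test */ }

    void main() {
        auto sw = StopWatch(AutoStart.yes);  // starts timing immediately
        expensiveWork();
        writefln("expensiveWork: %s ms", sw.peek.total!"msecs");
    }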

I'm sometimes right about where the problem was, and sometimes
quite surprised. It's those latter cases that justify this
strategy - it keeps me from barking up the wrong tree.

(except once, a week or so ago. I was tracking down a bug that
I believed to be in druntime. So I recompiled druntime in debug
mode.

At that point, it was 1am, so I went to bed. When I woke up the
next day... I forgot it was still in debug mode.

So I went to the site, and was annoyed to see it took almost 30
seconds to load one of the features. That's unacceptable. It was
a new feature with some fairly complex code, so it didn't occur
to me to check druntime.

I did the profiling and found a few slow parts in dom.d. Fixed
them up and got a 20x speedup.

But, it was still taking several seconds. Then, I remembered about
druntime! Recompiled it in the proper release mode, and it was
back to the speed I'm supposed to get - single digit milliseconds.

Yeah, the code is still 20x faster than it was, but the ~40 ms it
would have taken with a proper release druntime wasn't /too/ bad in
the first place...

Oh well though, while it was unnecessary, the profiling strategy
did still give good returns for the effort put in!)


I've had problems with database queries being slow, but they have
always been either complex queries or poorly designed tables. I've
never had a problem with reading or writing out sessions. There's
another profiling story behind one of those db queries, but I'll
spare you this time.

Then again, I only write sessions once, when a user logs in, and
reads are quick. So even if the system were slow, it wouldn't matter
much.


Fun fact: the fastest page load is no page load. Client side caching
is built into HTTP... but it seems to be rarely used. If you mix
your code and your output, you're screwed - headers have to go out
before the body, so you can't write an HTTP header after output has
already been sent!

My cgi.d now includes more than just the header() function though.
It keeps an updatable cache time that you can set from each
function, and that is output as late as possible.

That means you can set caching on the individual function level,
making it easy to manage, and still get a good result for the
page as a whole.
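
A minimal sketch of that idea (hypothetical names, not cgi.d's
actual API): buffer the body, let every function that builds part of
the page bid on the cache lifetime, and only emit headers at the end:

    import core.time : Duration, minutes;
    import std.array : Appender;
    import std.stdio : write, writef;

    // Sketch of the idea, not cgi.d's real interface.
    struct Response {
        private Duration maxAge;  // shortest cache lifetime requested so far
        private bool cacheSet;
        private Appender!string buf;

        // Any function generating part of the page may call this; the
        // shortest lifetime wins, so the page as a whole is never
        // cached longer than its most dynamic piece allows.
        void cacheFor(Duration d) {
            if (!cacheSet || d < maxAge) { maxAge = d; cacheSet = true; }
        }

        void put(string s) { buf.put(s); }  // buffer the body, send nothing yet

        // Headers go out only here, after every function has had its say.
        void send() {
            if (cacheSet)
                writef("Cache-Control: max-age=%d\r\n", maxAge.total!"seconds");
            write("Content-Type: text/html\r\n\r\n");
            write(buf.data);
        }
    }

    void main() {
        Response r;
        r.cacheFor(5.minutes);  // e.g. a mostly-static fragment
        r.put("<p>hello</p>");
        r.send();
    }

Tracking the minimum is the trick: a mostly static page with one
dynamic fragment gets the fragment's shorter lifetime, never the
other way around.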

Caching static content is obviously a plus. Caching dynamic content
can be a plus too. I've gotten some nice usability speed boosts by
adding the appropriate cache headers to ajax calls. Even an expiry
just a few minutes in the future is nice for users. (Of course,
the fastest javascript is also no javascript... but that's another
story.)

> But what do you mean by "independently"?

Let me give an example. The college I briefly attended had a "web
app" that was just a barely functional front end to an old COBOL app
on the backend.

The entire site depended on that flow - it kept a lot of state
server side. If you clicked "add class" in a new window, then went
to "view schedule" in the current window... both windows would fail.

The view-schedule request would error out because it isn't a proper
add-class command, and the add-class window would fail because the
session had errored out in the other window.


The thing was a godawful mess to use. You had to do everything
linearly, in just one window, every time. Ugh.


If it were better designed, each individual page would contain
all the state it needed, so each tab would work without affecting
the others - something like the sketch below.
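
To make that concrete, here's a hypothetical stateless handler (the
names and parameters are made up for illustration): every request
carries its own state, and everything else comes from persistent
storage, so there is no server-side "current command" for another
window to clobber:

    import std.stdio : writeln;

    // Stub standing in for a real database lookup.
    string fetchSchedule(string studentId, string term) {
        return "<ul><li>(rows from the database go here)</li></ul>";
    }

    // The page looks everything up from the request's own parameters
    // plus persistent storage -- no shared server-side flow state.
    string viewSchedule(string[string] params) {
        auto studentId = params.get("student", "");
        auto term      = params.get("term", "2011-fall");
        return "<h1>Schedule for " ~ studentId ~ ", " ~ term ~ "</h1>"
             ~ fetchSchedule(studentId, term);
    }

    void main() {
        // Two "tabs" making independent requests; neither can break the other.
        writeln(viewSchedule(["student": "12345"]));
        writeln(viewSchedule(["student": "12345", "term": "2011-spring"]));
    }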



This is an extreme example, and there are times when a little
bit of sharing is appropriate. But even then, you should ask
yourself whether server side sessions are really the right thing to do.

> *snip*
> Ah, here we are.

Yeah, you didn't have to do a line by line breakdown for something
I discussed in the following paragraph.

Though, I will say that handling things like adwords and webmail
doesn't need sessions beyond logging in and maybe cookies, and if
you use them for more than that, your site is poorly designed.

Database access vs. a session cache is another thing you'd profile.
I suspect you'd be surprised - database engine authors spend a lot
of time making sure their engines do fast reads, and frequently
used tables will sit in a RAM cache anyway.
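
If you want to check rather than guess, the same stopwatch trick
from earlier settles it. A sketch with hypothetical stand-ins for
the two read paths:

    import std.datetime.stopwatch : AutoStart, StopWatch;
    import std.stdio : writefln;

    // Hypothetical stand-ins for the two storage strategies.
    string readSessionFromDb(string id) { return "..."; }
    string readSessionFromCache(string id) { return "..."; }

    void time(string label, string delegate() read) {
        auto sw = StopWatch(AutoStart.yes);
        foreach (i; 0 .. 10_000) read();  // repeat to get a stable number
        writefln("%s: %s ms for 10k reads", label, sw.peek.total!"msecs");
    }

    void main() {
        time("database", () => readSessionFromDb("abc"));
        time("cache", () => readSessionFromCache("abc"));
    }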

> It *can* enforce, by refusing non-encrypted connection, or
> redirecting to https when access is done by http.

My point is that it requires server setup too, like buying and
installing a certificate. You can't *just* redirect and have it work.
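
The redirect itself is the trivial part. A minimal CGI-style sketch
(example.com is just a placeholder host); it buys you nothing until
the certificate is actually installed:

    import std.stdio : write;

    // CGI-style http -> https redirect. Easy to emit, but pointless
    // until the server actually has a certificate set up.
    void redirectToHttps(string host, string path) {
        write("Status: 301 Moved Permanently\r\n");
        write("Location: https://" ~ host ~ path ~ "\r\n\r\n");
    }

    void main() {
        redirectToHttps("example.com", "/login");
    }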

> Not really - process should flush any dirty data to persistent
> storage and quit, so new copy may catch on.

Indeed - you're using some kind of database anyway, so the advantage
of the long-running process is diminished.

