Posted on 06/22/2010 6:35:13 PM PDT by antidemoncrat
For the last few days, both my computers have had trouble loading Free Republic pages, sometimes taking several minutes. I've noticed more multiple postings than usual from the same Freepers, which tells me they are having the same problems. Has this been a problem for other users, or is it just me? Thanks
i had the same problem the past few days. my wife told me today she had installed something called flipmac(?) on our mac a few days ago. she removed it and now it's back to normal. have you added anything new to your computer recently?
The problem is that the server has to serve the entire page each time.
Most other sites have gone to html blocks, so the blocks are queued in a cache and don't have to be re-served.
This enables you to set up proxies to serve the cached html blocks, boosting the server's performance since it only serves the new html blocks, not whole pages.
Currently the Index and Comments are all one continuous html block that can’t be cached effectively.
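The fragment idea described above can be sketched in a few lines. This is a toy illustration, not FR's actual code; the fragment names and markup are made up. Each html block carries its own validator (an ETag here), so a repeat visitor re-downloads only the block that actually changed:

```python
import hashlib

# Hypothetical page split into separately cacheable html blocks.
fragments = {
    "masthead": "<div id='masthead'>...</div>",
    "sidebar": "<div id='sidebar'>...</div>",
    "thread-123": "<div id='thread-123'>original post</div>",
}

def etag(body):
    """Strong ETag derived from the fragment body."""
    return hashlib.md5(body.encode()).hexdigest()

def serve_fragment(name, if_none_match):
    """Return (status, body) the way a conditional GET would."""
    body = fragments[name]
    if if_none_match == etag(body):
        return 304, ""      # client cache still good: nothing re-sent
    return 200, body        # full fragment only when it actually changed

# First visit: every fragment is a full 200 transfer; remember the ETags.
cache = {name: etag(body) for name, body in fragments.items()}

# A new comment changes only one block...
fragments["thread-123"] = "<div id='thread-123'>original post + reply</div>"

# ...so on the next visit only that block is re-served.
statuses = {name: serve_fragment(name, cache[name])[0] for name in fragments}
print(statuses)  # thread-123 gets a 200, everything else a 304
```

With one continuous html block, any new comment invalidates the whole page and the entire thing is re-served.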
well kind of normal
Hugh da man!
I should say also, with the use of html blocks, in most cases the blocks are served from the user’s local browser cache so the proxies don’t even get hit by the users.
We’ve been using a caching system for years.
It is showing up as a single html block on the user side, so the caching is only working internally to your server.
It isn't working in downstream proxies or the users' browsers, which forces you to re-serve the entire page.
This is what we get from userland.
Raw request
GET http://www.freerepublic.com/tag/*/index HTTP/1.0
User-Agent: Opera/9.80 (Windows NT 6.0; U; en) Presto/2.5.24 Version/10.53
Host: www.freerepublic.com
Accept: text/html, application/xml;q=0.9, application/xhtml+xml, image/png, image/jpeg, image/gif, image/x-xbitmap, */*;q=0.1
Accept-Language: en-US,en;q=0.9
Accept-Charset: iso-8859-1, utf-8, utf-16, *;q=0.1
Accept-Encoding: deflate, gzip, x-gzip, identity, *;q=0
Cookie: __utmz=217263553.1277242371.2343.3.utmccn=(referral)|utmcsr=209.157.64.200|utmcct=/perl/login|utmcmd=referral; FOCUS=41522%3AHfZgoOwP4uJPAp6P5QExMZ%3A1; __utma=217263553.516295404.1232848598.1277258150.1277262665.2346; __utmc=217263553; __utmb=217263553
Cookie2: $Version=1
Pragma: no-cache
Cache-Control: no-cache
Proxy-Connection: Keep-Alive
Raw Response
HTTP/1.0 200 OK
Date: Tue, 22 Jun 2010 22:39:55 GMT
Server: Apache
Cache-Control: private
Content-Type: text/html; charset=latin1
X-Cache: MISS from zoolife
X-Cache-Lookup: MISS from zoolife:3128
Via: 1.1 zoolife:3128 (squid/2.7.STABLE7)
Connection: close
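The response above is marked `Cache-Control: private` and carries no freshness or validator headers, which is why the squid proxy in the trace logs `X-Cache: MISS`. A rough sketch of the check a shared cache makes, heavily simplified from the HTTP caching rules and for illustration only:

```python
def shared_cacheable(headers):
    """Rough check: may a shared proxy cache store this response?
    Simplified illustration of the HTTP caching rules, not a full implementation."""
    cc = headers.get("Cache-Control", "").lower()
    if "private" in cc or "no-store" in cc:
        return False  # shared caches must not store it
    # Without freshness info or a validator the proxy has little to work with.
    return (any(k in cc for k in ("max-age", "s-maxage", "public"))
            or "Expires" in headers
            or "ETag" in headers
            or "Last-Modified" in headers)

# Headers taken from the raw response above:
fr_response = {"Cache-Control": "private",
               "Content-Type": "text/html; charset=latin1"}
print(shared_cacheable(fr_response))  # False -> the proxy logs a MISS
```

A response with, say, `Cache-Control: public, max-age=60` would pass the same check and be served from the proxy on repeat hits.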
If there were html blocks, each thread would show up as a separate html doc.
Instead, what we get is a single html block.
Caching requires that these each be separate html blocks.
Suck
Whatever. We use a caching system. We couldn’t run without it. The recent slowdown had nothing to do with caching. It was due to a major hardware crash.
http://www.alexa.com/siteinfo/freerepublic.com
Average Load Time for Freerepublic.com
Very Fast (1.038 Seconds), 80% of sites are slower.
What I was saying, though, is that you can reduce your bandwidth requirement by 80% by using the local browser caches.
Then your pages would load that much faster.
A local cache is going to beat a server cached page any day of the week.
Yeah, well we’ve been using a caching system for years. And it does save us a lot of bandwidth. FR couldn’t deliver over a million pageviews per day without it.
FR is definitely the fastest site I log onto. As you say, you had technical problems, which appear to be fixed now. Thank you for doing a good job.
barbra ann
Yep been lagging here too....fine at this moment though...
My experience as well. All other sites load fast. I look at them while waiting for FR to load. Yep....Frustrating.
I am getting 100% cache misses on the main pages.
This site lists some of the best practices I follow, but the key is the difference between a site with many first-time visitors and one whose users hammer it all day.
http://developer.yahoo.com/performance/rules.html
Two different designs for different visitors.
One technique is to reduce the number of these HTTP requests. That is good for first-time visitors to a site.
The other is the reverse: everything is broken apart into 20 or more HTTP requests, with expiration information in the headers so the local cache can be used on repeat visits.
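The expiration-based approach works because a browser serves a still-fresh entry straight from its local cache without sending any request at all. A toy freshness check, assuming a simplified `Cache-Control` parser (real header parsing has more cases):

```python
def freshness_lifetime(cache_control):
    """Pull max-age out of a Cache-Control header (toy parser)."""
    for part in cache_control.split(","):
        part = part.strip()
        if part.startswith("max-age="):
            return int(part.split("=", 1)[1])
    return 0  # no freshness info: treat as immediately stale

def is_fresh(stored_at, cache_control, now):
    """Fresh entries are served from the local browser cache:
    no HTTP request leaves the machine."""
    return (now - stored_at) < freshness_lifetime(cache_control)

t0 = 1_000_000.0
print(is_fresh(t0, "max-age=3600", t0 + 600))   # True: served locally, zero bandwidth
print(is_fresh(t0, "max-age=3600", t0 + 7200))  # False: revalidate or refetch
```

For a site whose users reload the same pages all day, that second design is where the bandwidth savings come from.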
Anyways, that is what I was trying to describe.
I think Freerepublic is made up of a bunch of users that are hitting the site all day long.
same on my Mac and my iPhone. I just figured "they" are tracking us. Something is definitely amiss at the Circle K.
Sheesh. We had a hardware crash. Wasn’t the first time, won’t be the last.
Seems to be working better now