Dynamic pages, things like Java and Flash, would be a limitation, I know; there might even be a few workarounds, but the majority of that content, I agree, couldn't be obtained and run offline. Some Flash programs only serve as a medium to load content off a server. Take Flash Flash Revolution as an example: when the site went down last year, it was possible to download all 2 GB of the engine, songs, and charts and play it offline.
However, there are some things, like GameFAQs pages, wiki pages, and various other applications, where I can see it benefiting me, especially in places where my internet connection drops, since a dropped connection wouldn't be a burden anymore.
---
It would be possible to create an AHK script to copy entire dynamic webpages and store them in a file. But how would the AHK script know what to look for? This is where ideas could come into play.
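A minimal sketch of the "grab a page and keep a local copy" step, written in Python rather than AHK just for illustration; the function names and folder layout here are my own assumptions, not anyone's actual script, and a plain fetch like this only captures the page source, not anything rendered by scripts after load:

```python
import urllib.request
from pathlib import Path

def save_snapshot(html: str, out_dir: str, name: str) -> Path:
    """Write one page's HTML to a local file so it can be reopened offline."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    path = out / f"{name}.html"
    path.write_text(html, encoding="utf-8")
    return path

def fetch(url: str) -> str:
    """Download the raw page source. Dynamic content (Flash, script-loaded
    data) won't be in here, which is exactly the limitation discussed above."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

Something like `save_snapshot(fetch("http://example.com"), "offline_cache", "example")` would be the whole round trip for a static page.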
I created an AutoHotkey script, along with a Thunderbird setup, so I can send a text message to a certain email address in a very specific format. The AutoHotkey script runs some macros to insert an address I put into Google Maps, then sends the directions to my cell.
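The "email to cell" part works because most carriers run email-to-SMS gateways. A hedged sketch of just the message-building step in Python; the sender address is hypothetical, and the gateway domain varies by carrier (actually sending it would go through `smtplib`, which I've left out here):

```python
from email.message import EmailMessage

def build_directions_sms(directions: str, number: str, gateway: str) -> EmailMessage:
    """Build an email that a carrier's email-to-SMS gateway will deliver
    as a text message to <number>@<gateway>."""
    msg = EmailMessage()
    msg["To"] = f"{number}@{gateway}"
    msg["From"] = "me@example.com"  # hypothetical sender address
    msg["Subject"] = "Directions"
    msg.set_content(directions[:160])  # keep it within a single SMS
    return msg
```

From there, a mail client (or `smtplib.SMTP`) just sends the message like any other email.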
QUOTE said:
EDIT: Reading your edit, there certainly are offline dumps of wikipedia and the like, but you would still only be able to do it by specific webpages at a time and even then could only spider the links there. Having it amass everything on something like "Microbiology" constantly in the background while you work would be insanely impractical. There's a reason Google has massive server farms that do nothing but spider the web all day.
Impractical today in some cases, sure, but it still has practical applications. It was mainly meant as an example rather than an attempt to pin down the details.
It would work like a giant RSS feed: you list which websites to feed off of, and it checks each one and maybe compares it against the last version it saw.
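The "check and compare" loop could be as simple as hashing each page and flagging the ones whose hash changed since last time. A sketch under my own assumptions (the function names and the injected `fetch` callable are made up for illustration):

```python
import hashlib

def digest(content: bytes) -> str:
    """Fingerprint a page's content so two fetches can be compared cheaply."""
    return hashlib.sha256(content).hexdigest()

def changed_sites(feed_list, fetch, seen):
    """Check every listed site and return the ones that changed.

    feed_list -- iterable of URLs, like the RSS-style subscription list
    fetch     -- callable taking a URL and returning its content as bytes
    seen      -- dict mapping URL -> last known hash (updated in place)
    """
    updated = []
    for url in feed_list:
        h = digest(fetch(url))
        if seen.get(url) != h:
            seen[url] = h
            updated.append(url)
    return updated
```

Run on a schedule, only the URLs in the returned list would need to be re-downloaded into the offline cache.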
Thanks, Rydian, for that link; that is pretty much what I was looking for. Something similar to that, but with the combination of a nice GUI, active transfer details, and integration with a browser.