Back up forum's contents?
This is a question for Joe Strout or other admins:
Is there any way I can get a dump of the posts on the site?
I think the information here is valuable and should be available in some sort of archive. phpBB uses a database backend, so a few SQL queries should do it.
I have no intention of setting up a second site. I'm just talking about geographically distributed backups. And if the answer is no, thanks anyway.
I also hope that this site is backed up. Backups (in the form of regularly created compressed SQL dumps) distributed on servers around the world would be great. I can offer you space on my server for backups (the server is in the Czech Republic).
"Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety."
-- Benjamin Franklin
dch24, I was thinking the exact same thing!
1) I don't keep a journal or a blog. My posts here are the only record of my thoughts and ideas. Thus I would really like a copy, just in case.
2) As I said to Dr Nebel, my thoughts aren't "is this going to work" but "what's going to happen next". Thus I would really like a copy, just in case.
So what's the best way to do this? I think exporting pure BBCode might be a bit extreme, but I would proudly keep a copy. It would be nice if it could appear in a browser the way we are seeing it now. I know there are a few programs designed to save websites. Any suggestions, guys?
Purity is Power
Thanks, Joe
There are easy ways to control the "spidering" hit. For wget:
--quota limits the total amount downloaded
--wait sets a pause between requests, limiting how fast they hit the server
--limit-rate is very rough, but on large files it can cap network bandwidth
So if anyone feels like the server is being abused, please just say something. I'm currently sending requests from 64.38.220.4.
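Put together, a throttled mirror run might look like this. This is only a sketch: the wait, rate, and quota values below are arbitrary guesses to tune, not settings anyone in this thread recommended.

```shell
# Sketch of a politer mirror run; the numeric limits are guesses, tune to taste.
wget --mirror \
     --wait=2 \
     --limit-rate=50k \
     --quota=100m \
     http://www.talk-polywell.org/bb/index.php
# --wait=2          pause two seconds between requests
# --limit-rate=50k  cap transfer speed at roughly 50 KB/s
# --quota=100m      stop once about 100 MB have been downloaded
```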
Joe will be best informed.
However I'll post back here if I notice any slow down in posting.
The best time would be from Midnight USA Central Time to about 8 AM GMT from what I have seen of traffic patterns. That is about a 3 hour window.
Engineering is the art of making what you want from what you can get at a profit.
I ran the following command:
Code: Select all
wget -nv -EpkKm http://www.talk-polywell.org/bb/index.php
Then I compressed the output (1.1 GB) down to 63 MB and placed it at: http://polywell.nfshost.com/2008_05_13_ ... ks.tar.bz2
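The compression step was presumably something along these lines. This is a guess, not the poster's actual command: the archive name is hypothetical, and the mkdir/echo lines just build a tiny stand-in for the real mirrored tree so the sketch is self-contained.

```shell
# Hypothetical names; the stand-in directory replaces the real wget mirror tree.
MIRROR_DIR="www.talk-polywell.org"
mkdir -p "$MIRROR_DIR/bb"
echo '<html>sample page</html>' > "$MIRROR_DIR/bb/index.html"

# -c create an archive, -j run it through bzip2, -f write to the named file.
tar -cjf polywell-backup.tar.bz2 "$MIRROR_DIR"
```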
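A minimal sketch of the outside-link scan mentioned above might look like the following. The directory layout (the tree that `wget -m` saves) and the regex are assumptions, not the poster's actual script.

```shell
#!/bin/sh
# Hypothetical helper: list every off-site URL referenced by a wget mirror,
# so pictures, PDFs, etc. can be fed into the next backup run.
list_outside_links() {
    mirror_dir=$1   # e.g. the www.talk-polywell.org tree saved by `wget -m`
    # Pull every absolute href/src URL out of the saved HTML, then drop
    # anything that points back at the forum itself.
    grep -rhoE '(href|src)="https?://[^"]+"' "$mirror_dir" \
        | sed -E 's/^(href|src)="([^"]+)"$/\2/' \
        | grep -v '://www\.talk-polywell\.org/' \
        | sort -u
}
```

Running `list_outside_links www.talk-polywell.org > outside-links.txt` gives one off-site URL per line, which could then be handed back to wget with `-i outside-links.txt`.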
If you open an account here:
http://www.mediafire.com/
You should be able to park files up to 100 MB for no charge.
dch24 wrote: Hi derg, did you get a chance to download the backup while I had it posted?
I don't know what Joe thinks, but every two weeks or so would be better.
I will post another one in a few months and leave it posted for a week or so.