How to export lots of WordPress posts to a text document?


I need to export all posts from a certain category (which contains thousands of posts) to a text document. Then someone will make corrections and changes in that document, and afterwards I have to enter all the updated posts back into WP.
So I decided that the best way is to build an XML document (that way it will be easy to enter the posts back).
So my code is:

require_once(dirname(__FILE__) . '/wp-blog-header.php');

$counter  = 0;
$recorded = array(); // post IDs already written, to avoid duplicates
$double   = 0;

$handle = fopen("all_posts.xml", "w");
fwrite($handle, "<all_posts>" . "\r\n"); // the root XML tag

// get all the child categories of the global category (term ID 5)
$global_cat = get_categories(array(
    'child_of'     => 5,
    'pad_counts'   => true,
    'hierarchical' => false,
));

foreach ($global_cat as $child_cat) {

    global $post;
    $args = array('numberposts' => 50000, 'cat' => $child_cat->cat_ID);
    print_r($child_cat);
    echo "<br>" . $counter . "<br>";

    $q_posts = get_posts($args);
    foreach ($q_posts as $post) {
        setup_postdata($post);

        // skip posts that were already written (a post may appear in several categories)
        if (in_array($post->ID, $recorded)) {
            continue;
        }
        $recorded[] = $post->ID;

        $counter++;
        $title          = get_the_title();
        $cur_categories = get_the_category();
        $cur_tags       = get_the_tags(); // returns false when the post has no tags
        $d              = get_the_date();
        $cont           = get_the_content();

        fwrite($handle, "<post>" . "\r\n");

        fwrite($handle, "<title>" . $title . "</title>" . "\r\n");
        fwrite($handle, "<id>" . $post->ID . "</id>" . "\r\n");
        fwrite($handle, "<cur_cat>" . $child_cat->name . "</cur_cat>" . "\r\n");

        fwrite($handle, "<categories>\r\n");
        foreach ($cur_categories as $cat) {
            fwrite($handle, "<cat>" . $cat->cat_name . "</cat>");
        }
        fwrite($handle, "\r\n</categories>" . "\r\n");

        fwrite($handle, "<tags>\r\n");
        if (is_array($cur_tags)) { // guard against false from get_the_tags()
            foreach ($cur_tags as $tag) {
                fwrite($handle, "<tag>" . $tag->name . "</tag>");
            }
        }
        fwrite($handle, "\r\n</tags>" . "\r\n");

        fwrite($handle, "<date>" . $d . "</date>\r\n");
        fwrite($handle, "<content>\r\n" . $cont . "</content>\r\n\r\n");

        fwrite($handle, "</post>" . "\r\n");
    }
}

fwrite($handle, "</all_posts>");
fclose($handle);

The problem is that, because there are something like 10,000 posts, the server never returns a response [I think it is because the XML file becomes too big, or because the PHP script takes excessively long to run]. It only works well when I export a category that has only around 2,000 posts.
What is the way to fix it?


There are 3 answers

HamZa On BEST ANSWER

At the top of your script, set max_execution_time to unlimited (or to a few minutes):

ini_set('max_execution_time', 0);

You can also try boosting the memory limit used by PHP:

ini_set('memory_limit', '100M');
Emil Vikström On

You are probably hitting the time limit for scripts. This can sometimes be changed with set_time_limit.

Otherwise you can limit the export to a few hundred posts at a time. Just change the offset option for get_posts between runs, as in the sketch below.
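
A rough sketch of that batching idea, reusing the export loop from the question ($batch_size and $offset are illustrative names, not part of the original code):

// Hypothetical batching sketch: fetch posts in chunks of 500 and
// bump the 'offset' argument after each chunk.
$batch_size = 500;
$offset     = 0;

do {
    $q_posts = get_posts(array(
        'numberposts' => $batch_size,
        'offset'      => $offset,
        'cat'         => $child_cat->cat_ID,
    ));

    foreach ($q_posts as $post) {
        setup_postdata($post);
        // ... write the <post> element exactly as in the question ...
    }

    $offset += $batch_size;
} while (count($q_posts) === $batch_size);

You could also run the script several times with a fixed offset per run, so each request stays well under the time limit.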

Ethan Brouwer On

Try this.

Where you have:

$args = array('numberposts' => 50000,'cat' => $child_cat->cat_ID);

put

$args = array('numberposts' => 50000,'cat' => $child_cat->cat_ID, 'post_status' => 'publish' );

This will make sure you only get the posts that are published. Is this what you want, or do you want all of them?

If you want all of them, try putting all of the code in a loop over the categories and inserting the category name into the name of the file. You end up with multiple files, but you can still process each of them individually. This is probably your best bet, since a single file would be too large. A sketch of this follows.
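
A hedged sketch of the one-file-per-category approach, reusing the question's loop (naming the files with sanitize_title() is my assumption, not part of the original code):

// Sketch: write one XML file per child category so no single file grows too large.
foreach ($global_cat as $child_cat) {
    // sanitize_title() is a core WP helper; using it for the file name is an assumption
    $file   = 'posts_' . sanitize_title($child_cat->name) . '.xml';
    $handle = fopen($file, "w");
    fwrite($handle, "<all_posts>\r\n");

    $q_posts = get_posts(array(
        'numberposts' => 50000,
        'cat'         => $child_cat->cat_ID,
        'post_status' => 'publish',
    ));

    foreach ($q_posts as $post) {
        setup_postdata($post);
        // ... write the <post> element exactly as in the question ...
    }

    fwrite($handle, "</all_posts>\r\n");
    fclose($handle);
}

Each file can then be corrected and re-imported on its own.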
