Zend Server installation potentially kills your SSH

Installing Zend Server can potentially kill your server, as in you cannot SSH into it anymore. SSH gets suicidal:

Aug 31 11:12:54 xxx kernel: sshd[4246]: segfault at 7fff237db000 ip 7f871b7bfc84 sp 7fff237d93d0 error 6 in sshd[7f871b77a000+61000]

SSH’ing to localhost from a local console can give more information:

ssh: /usr/local/zend/lib/libcrypto.so.0.9.8: no version information available (required by ssh)
ssh_exchange_identification: Connection closed by remote host

You don’t see the error message above unless you have a local console available, so realizing what has happened might make you panic for a few seconds.

However, after getting into a local console you can cure Zend with fire:

cd /usr/local/
rm -rf zend

Then kill all processes related to Zend:

ps -ef | grep '[z]end' | awk '{ print $2 }' | xargs kill

Restart SSH so that it loads a working libcrypto from /usr/lib:

/etc/init.d/ssh restart

… and never look back at Zend Server.

“Tested” on Ubuntu 8.04 Linux.

\"\" Subscribe to RSS feed Follow me on Twitter Follow me on Facebook Follow me Google+

Automatically removing old items from a Plone site

Below is an advanced version of date-based deletion code for old items, suitable for huge sites. This snippet is from the Products.feedfeeder package. It looks for Feedfeeder items (automatically generated from RSS) which are older than X days and deletes them.

It’s based on a Zope 3 page registration (sidenote: I noticed that views do not need to be based on the BrowserView class).

  • Transaction thresholds keep the commit buffer small, so the code runs faster
  • Logging goes to the Plone event log files
  • The number of days to look into the past is not hardcoded
  • Manage portal rights are needed to execute the code

You can call this view like:

http://localhost:9999/plonecommunity/@@feed-mega-cleanup?days=90

… and hook it to the Zope clock server or run it as a cron job.
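
Since the view requires Manage portal rights, a cron job must authenticate itself. Below is a minimal, hypothetical helper script (Python 2, like the view code below) which calls the view over HTTP using urllib2 with Basic auth. The URL, user name and password here are just placeholders you need to adapt:

# cleanup_feeds.py - hypothetical cron helper which calls the clean-up view
import urllib2

URL = "http://localhost:9999/plonecommunity/@@feed-mega-cleanup?days=90"

# Assumption: a Zope user with Manage portal rights exists; adjust credentials
password_manager = urllib2.HTTPPasswordMgrWithDefaultRealm()
password_manager.add_password(None, URL, "admin", "secret")
opener = urllib2.build_opener(urllib2.HTTPBasicAuthHandler(password_manager))

# The view returns a short status message like "Total N items removed"
print opener.open(URL).read()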

Here is the view Python source code:

import logging

import transaction
from zope import interface
from zope import component
import DateTime
import zExceptions

logger = logging.getLogger("feedfeeder")

class MegaClean(object):
    """ Clean-up old feed items by deleting them on the site.

    This is intended to be called from cron weekly.
    """

    def __init__(self, context, request):
        self.context = context
        self.request = request

    def clean(self, days, transaction_threshold=100):
        """ Perform the clean-up by looking up old objects and deleting them.

        Commit the ZODB transaction for every N objects so that the commit buffer
        does not grow too large (time-wise, memory-wise).

        @param days: items created more than this many days ago are deleted

        @param transaction_threshold: how often we commit - for every nth item
        """

        logger.info("Beginning feed clean up process")

        context = self.context.aq_inner
        count = 0

        # DateTime deltas are days as floating points
        end = DateTime.DateTime() - days
        start = DateTime.DateTime(2000, 1, 1)

        date_range_query = {"query": (start, end), "range": "min:max"}

        items = context.portal_catalog.queryCatalog({"portal_type": "FeedFeederItem",
                                                     "created": date_range_query,
                                                     "sort_on": "created"
                                                    })

        items = list(items)

        logger.info("Found %d items to be purged" % len(items))

        for b in items:
            count += 1
            obj = b.getObject()
            logger.info("Deleting:" + obj.absolute_url() + " " + str(obj.created()))
            obj.aq_parent.manage_delObjects([obj.getId()])

            if count % transaction_threshold == 0:
                # Prevent transaction becoming too large (memory buffer)
                # by committing now and then
                logger.info("Committing transaction")
                transaction.commit()

        msg = "Total %d items removed" % count
        logger.info(msg)

        return msg

    def __call__(self):

        days = self.request.form.get("days", None)
        if not days:
            raise zExceptions.InternalError("Bad input. Please give days=60 as HTTP GET query parameter")

        days = int(days)

        return self.clean(days)

Then we have the view ZCML registration (this goes into your package’s configure.zcml, in the browser ZCML namespace):

<page
    name="feed-mega-cleanup"
    for="Products.CMFCore.interfaces.ISiteRoot"
    permission="cmf.ManagePortal"
    class=".feed.MegaClean"
    />

\"\" Subscribe to RSS feed Follow me on Twitter Follow me on Facebook Follow me Google+

reStructuredText on-line edit and preview: rst.ninjs.org

Python uses the reStructuredText mark-up a lot in code documentation. It’s a lightweight (read: easy to type) mark-up for formatting your technical documentation. There is a special twist: reST is very readable in its plain-text source format too. This makes it very suitable for generating API documentation from source code comments.

The Sphinx documentation suite uses reStructuredText internally, and pypi.python.org package descriptions are written in reST.

Here is a very good on-line service by Andrey Rublev which allows you to type in reStructuredText and see the results in real time. For example, I prepare many blog posts offline, convert them to HTML with rst.ninjs.org and then paste them into the WordPress HTML source view.

The service source code is available on GitHub.
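
If you want to do the same reST to HTML conversion locally, below is a minimal sketch using docutils, the reference reST implementation (the input file name is just an example):

# rst2html_snippet.py - convert a reST file to an HTML fragment with docutils
from docutils.core import publish_parts

text = open("blog-post.rst").read()
parts = publish_parts(source=text, writer_name="html")

# "body" contains just the HTML fragment - handy for pasting into WordPress
print parts["body"]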

(note: the external link syntax in reST – `Link text <http://example.com>`_ – is horrible; could we create something smarter for this, as it is one of the most often used features?)

\"\" Subscribe to RSS feed Follow me on Twitter Follow me on Facebook Follow me Google+

Feedburner, Planet Venus and categorized posts

This post gives blog owners some insight into how to tune WordPress & Feedburner when feeding their posts to various open source planets / aggregation services.

1. Preface

Google’s Feedburner is a popular feed subscriber statistics service for blog owners. Planet Venus is popular feed aggregation software used by many open source projects to create a website of gathered RSS feeds (an example).

WordPress is popular blogging software and has a plug-in which automatically enables Feedburner statistics for all the feeds using HTTP redirects. For a blogger, using the plug-in means painless set-up of Feedburner statistics on his/her blog.

2. Problem

Usually planets (the aggregation services) are only interested in blog posts of a certain category (in this example let’s call the category “plone”). WordPress exposes categorized RSS 2.0 feeds using URLs of the following syntax:

http://opensourcehacker.com/category/plone/feed/

This URL is then put into the Planet Venus configuration, and Planet Venus starts aggregating posts of that category from the source blog.

Now, if the blog is feeding out categorized posts AND is using Feedburner, a problem arises. The Feedburner redirects at feedburner.google.com might not properly handle categorized feeds. Instead, you’ll get an error page saying “looks like your computer might be doing too much automated requests” or something along those lines.

3. Solution

Luckily there is an easy fix. In the WordPress Feedburner plug-in settings tick the following option:

Settings -> Feedburner ->

This should allow the aggregators to gather the categorized posts and still use Feedburner stats for the users who subscribe to the main RSS feed of your blog.
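
To verify the fix, you can check that the categorized feed is now served directly instead of redirecting to feedburner.google.com. Here is a quick, hypothetical check with Python’s urllib2, using the example feed URL from above:

# check_feed.py - verify the categorized feed is served from your own domain
import urllib2

FEED_URL = "http://opensourcehacker.com/category/plone/feed/"

response = urllib2.urlopen(FEED_URL)
# urllib2 follows redirects; geturl() returns the final URL after any redirect
print response.geturl()  # should not point to feedburner.google.com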

 

\"\" Subscribe to RSS feed Follow me on Twitter Follow me on Facebook Follow me Google+