
Plone 4 dev on Lion

Introduction

Recently I upgraded to OS X Lion, and here are my notes on how I got my development environment up and running again.

XCode

First things first: you need to set up build tools (gcc, make and the like). On OS X these come as part of XCode. Even if you had XCode installed on Snow Leopard before upgrading to Lion, take the time to reinstall it completely. Things have changed since the white cat and it’s safer to have your build tools up-to-date.
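Once the reinstall finishes, a quick sanity check confirms the toolchain is actually in place (the exact version strings will vary depending on which XCode release you installed):

```shell
# Verify the build toolchain is available after reinstalling XCode.
# Version numbers will differ between XCode releases.
gcc --version
make --version
```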

collective.buildout.python

Same story as with XCode. Remove your collective.buildout.python directory entirely and recreate it from scratch. It takes a while, so go make yourself some nice tea.

UPDATE: On OS X 10.7.1 readline does not get built because its configure script does not correctly recognize the OS. You can fix this by appending the following two lines to the bottom of src/readline.cfg in your local checkout of collective.buildout.python and re-running buildout.

make-options =
    SHOBJ_LDFLAGS="-dynamiclib"

egg cache

If you are using a local egg cache, things might get tricky. Some people report everything works fine, but several others report that things kept breaking randomly until they removed all eggs from their egg cache and re-ran buildout. My suggestion? Clear the cache! It takes a while to download all those eggs again, but it beats losing a day chasing readline-related SEGFAULTs that crash Zope on seemingly random requests:

Program received signal EXC_BAD_ACCESS, Could not access memory.
Reason: KERN_INVALID_ADDRESS at address: 0x0000000000000000
0x00000001131ccf09 in call_readline ()
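For reference, the cache to wipe is whatever your buildout’s eggs-directory points at. A typical setup keeps it in ~/.buildout/default.cfg; the path below is just a placeholder, adjust it to your own configuration:

```ini
[buildout]
# Every egg in this directory was built against the old toolchain.
# Delete its contents, then re-run buildout to repopulate it.
eggs-directory = /Users/yourname/.buildout/eggs
```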

lxml

If you are using z3c.recipe.staticlxml, you are in for more trouble. The latest version uses libxml2-2.6.32 and libxslt-1.1.24, and these two break in a very nasty way when you try to access a Plone site themed with Diazo. Zope crashes with no traceback. Attaching gdb to the process spits out the following error:

Program received signal EXC_BAD_ACCESS, Could not access memory.
Reason: KERN_INVALID_ADDRESS at address: 0x0000001400001c4f
[Switching to process 89119]
0x00007fff8dbcd403 in strncmp ()
(gdb) bt
#0  0x00007fff8dbcd403 in strncmp ()
#1  0x00000001088e5830 in __xmlParserInputBufferCreateFilename (URI=0x1400001c4f <Address 0x1400001c4f out of bounds>, enc=163580032) at xmlIO.c:2412
#2  0x00000001088bc951 in xmlNewInputFromFile (ctxt=0x10985e5b0, filename=0x10407fd30 "/Users/zupo/work/example.project/eggs/diazo-1.0rc3-py2.6.egg/diazo/defaults.xsl") at parserInternals.c:1462
[...snip...]

One solution is to add the following two lines to the [lxml] part of your buildout.cfg:

libxml2-url = ftp://xmlsoft.org/libxml2/libxml2-2.7.8.tar.gz
libxslt-url = ftp://xmlsoft.org/libxml2/libxslt-1.1.26.tar.gz

These tell lxml to build against newer versions of libxml2 and libxslt, making those nasty errors go away.

UPDATE: You can just as well use the latest version of z3c.recipe.staticlxml, which includes URLs to newer versions of both libxml2 and libxslt.

The other solution is simply not to use z3c.recipe.staticlxml at all. It seems you can now build against the system libxml2 provided by XCode (located at /usr/include/libxml2). To do this, remove the [lxml] part from your buildout.cfg and re-run buildout.
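A minimal sketch of that second option (part names are just examples, mirroring the ones used elsewhere in this post): drop the [lxml] part entirely and keep lxml among your eggs, so it compiles against the system libxml2.

```ini
[buildout]
# No [lxml] part here: lxml now builds against XCode's
# system libxml2 in /usr/include/libxml2.
parts =
  zopepy

[zopepy]
recipe = zc.recipe.egg
interpreter = zopepy
eggs =
  lxml
```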

Order of ‘parts’ when compiling lxml

CentOS’s repos don’t have a working version of libxslt (you need 1.1.20, the repos have 1.1.17), so we need to statically compile it for collective.xdv to work.

But, there is a catch! You need to be careful about how you order your parts in your buildout.cfg.

For example, the following buildout.cfg works perfectly fine: it downloads libxml2 and libxslt and compiles them in your buildout environment for your use:

[buildout]
extends = http://dist.plone.org/release/4.0/versions.cfg
parts =
  lxml
  zopepy

[zopepy]
recipe = zc.recipe.egg
interpreter = zopepy
eggs =
  lxml

[lxml]
recipe = z3c.recipe.staticlxml
egg = lxml
static-build = true

Problems start when you change the order of the ‘parts’ section to look like this:

[buildout]
parts =
  zopepy
  lxml

...

In this case, zopepy wants to fetch and compile the lxml egg immediately. Since the ‘lxml’ part has not been run yet, there are no libxml2 and libxslt libraries available in your buildout, and the whole thing crashes. The point being: always put the lxml part at the top of your parts list.

The fix is a one-liner, and I just spent 2 days figuring it out. I wish I had known this earlier.

UPDATE: Florian Schulze pointed out that “an even better fix is to reference the lxml part from zopepy.

[buildout]
extends = http://dist.plone.org/release/4.0/versions.cfg
parts =
  lxml
  zopepy

[zopepy]
recipe = zc.recipe.egg
interpreter = zopepy
eggs =
  ${lxml:egg}

[lxml]
recipe = z3c.recipe.staticlxml
egg = lxml
static-build = true 

That way buildout always installs the lxml part first, because it installs a part before resolving any references to it. This way you actually only need zopepy in parts and can leave out lxml completely.”

Compiling lxml on 64bit CentOS

A few days ago I encountered a problem while deploying Plone 4 with collective.xdv to a CentOS cloud instance. Since CentOS’s repos were a bit out of date, I needed to statically compile lxml and its dependencies with z3c.recipe.staticlxml. Here’s what you need to add to your buildout.cfg to do so:

parts += lxml
eggs += lxml

# ===================================================================
# For collective.xdv to work properly, we need a static build of lxml
# and its dependencies on OS X and x86_64 Linux
# ===================================================================
[lxml]
recipe = z3c.recipe.staticlxml
egg = lxml
force = false
static-build = true

And here’s the kicker: on some 64-bit Linux systems, compiling lxml produces an error like this:

lxml: Building lxml ...
Building lxml version 2.2.6.
NOTE: Trying to build without Cython, pre-generated 'src/lxml/lxml.etree.c' needs to be available.
Using build configuration of libxslt 1.1.24
Building against libxml2/libxslt in one of the following directories:
  /home/production/1.1/parts/lxml/libxslt/lib
  /home/production/1.1/parts/lxml/libxml2/lib
/usr/bin/ld: /home/production/1.1/parts/lxml/libxslt/lib/libxslt.a(xslt.o): relocation R_X86_64_32 against `a local symbol' can not be used when making a shared object; recompile with -fPIC
/home/production/1.1/parts/lxml/libxslt/lib/libxslt.a: could not read symbols: Bad value
collect2: ld returned 1 exit status
error: Setup script exited with error: command 'gcc' failed with exit status 1
An error occured when trying to install lxml 2.2.6. Look above this message for any errors that were output by easy_install.
While:
  Updating lxml.
Error: Couldn't install: lxml 2.2.6

Some problem with the -fPIC flag not being set? After googling around without much success, I decided to look directly at the trunk of z3c.recipe.staticlxml, and luckily Reinout had hit the same problem before me and already committed a patch. So all that was needed was to pull z3c.recipe.staticlxml from trunk, and the compile error went away.

UPDATE: I emailed Stephan Eletzhofer asking for a new release of z3c.recipe.staticlxml that would include Reinout’s patch. He responded quickly, and now all you need to do is make sure you have z3c.recipe.staticlxml version 0.7.2 or higher:

[versions]
# we need this so that the -fPIC flag is set when compiling lxml on 64-bit Linux
z3c.recipe.staticlxml = 0.7.2

OS X specific .cfg for collective.xdv

The collective.xdv manual on plone.org tells you how to customize your buildout.cfg in order to be able to run collective.xdv on OS X. However, if you are working in a team of developers where not everyone is using OS X, it’s nice to keep your OS X-specific buildout hacks in a separate osx.cfg file.

Below is an osx.cfg file that extends the original buildout.cfg with OS X-specific hacks for collective.xdv.

[buildout]
extends = buildout.cfg
parts +=
    lxml

versions = versions

[versions]
lxml = 2.2.2

[lxml]
recipe = z3c.recipe.staticlxml
egg = lxml
force = false

Now all you need to do is run buildout with the config file set to osx.cfg.

$ bin/buildout -c osx.cfg

Plone4 + collective.xdv + deco.gs

Here at NiteoWeb Ltd. we decided a few days ago that we should start trying out Plone 4 in order to be ready for the stable release. There are quite a few new features that I can’t wait to start using on production sites.

Being in an adventurous mood, I volunteered for the job. I also wanted to give collective.xdv a go. So the first step was to grab the Plone 4 development buildout and marry it with the collective.xdv buildout for Plone 3. After some trial and error it started working. You basically just add collective.xdv to the eggs section in buildout.cfg and that’s it:

eggs =
  Plone
  plone.reload
  collective.xdv

Then I was off to code the HTML template that I would use with xdv. After a few lines of CSS I remembered Limi‘s talk about deco.gs at the Plone Conference, and, still feeling adventurous, I couldn’t resist the temptation to position elements in the HTML template with the Deco grid system. It didn’t go as smoothly as installing collective.xdv, since documentation is lacking, but the general idea is simple enough that you can grasp it rather quickly.
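For the curious, the grid markup boils down to rows of positioned cells. The class names below are my recollection of the deco.gs conventions (a 16-column grid with width-* and position-* classes), so treat them as assumptions and double-check against the deco.gs source:

```html
<!-- Sketch of deco.gs-style markup: each .cell declares its width and
     horizontal position in grid columns. Class names are assumptions
     based on my reading of deco.gs; verify against its stylesheets. -->
<div class="row">
  <div class="cell position-0 width-12">main content area</div>
  <div class="cell position-12 width-4">sidebar</div>
</div>
```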

Having a proof-of-concept for all three components, I needed a real site to try it out on. So I decided to update our old corporate website. After a rainy weekend it was ready to go live. As a beta :). Hence the warning:


This site is using a bleeding-edge Plone version. Please bear with us if something isn’t working properly. We are doing this to test the new version of Plone and help make it better for everyone.


TODO:

  • port (if necessary) ContentWellPortlets product to Plone 4 and install it to have a dynamic footer
  • port (if necessary) Scrawl blogging platform to Plone 4 and install it
  • integrate Disqus commenting platform