Saturday, November 09, 2013

Tips for Integrating a Third-Party PHP Library with MODX

What if you want to add functionality to MODX for which there’s not yet a MODX extra, but there is a third-party PHP library that does what you want? Here are a few tips I discovered that may be of help to others. This may or may not work for you depending on the library and environment.

In many cases, you may just want to place raw PHP scripts alongside your MODX scripts to run independently of MODX. You can access MODX from those scripts as well. But if you want to call the library from within MODX, or even possibly to create your own MODX extra, then this article may be for you.

For this article, let's call the third-party library foo and the MODX extra modfoo.

Create a Transport Package

First off, let's set up a project directory and build script for the modfoo component we are building with the third-party library foo (see Creating a 3rd Party Component Build Script). In particular, it will need a build.transport.php file that has a file resolver for the core folder. If any snippets or other elements will be created, it will need to package those, along with any assets. In the Database section we will be adding a resolver to create the database tables as well.
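As a rough sketch, the file resolver in build.transport.php might look like the following. (The $sources paths are placeholders from a typical build script; adjust them to your project layout.)

```php
// In _build/build.transport.php: copy the component's core folder into
// place when the package is installed. $sources['source_core'] is
// assumed to point at core/components/modfoo in the project directory.
$vehicle->resolve('file', array(
    'source' => $sources['source_core'],
    'target' => "return MODX_CORE_PATH . 'components/';",
));
```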


If the library is available as a Composer package, here’s how we can add it to our project. (Note that we are not using Composer from the project's root directory since then the library wouldn't be packaged with our build script.)
  • Install Composer if it isn't already installed, and make sure it can be run from the command line.
  • Open the command line and navigate to the modfoo project folder.
  • From there, navigate to core/components/modfoo/model (create it if it does not yet exist)
  • Run:
    composer init 
  • Add the package to the composer.json file in the model directory. E.g.
        "require": {
            "bar/foo": "1.*"
        }
  • Then run:
    composer install
  • A vendor directory will be created with the foo package. We can add it to our .gitignore file, e.g. by adding the line core/components/modfoo/model/vendor/


Database

The third-party library may require some database tables to be created in order to work. Chances are the library wasn't built with xPDO in mind.

We'll use sections from the Developing an Extra in MODX Revolution tutorial for reference, but with some changes.

We can pretty much ignore the Making the Model section. That section defines the schema in XML, then generates xPDO class files and database tables from it. But we don't want to generate additional class files, since the library should already have its own.

We'll skip to The Schema Parsing Script section, and use the first 8 lines of the sample _build/build.schema.php file.

Then, use the library's documentation to locate the SQL statements to create the table(s). Create a $sql string variable and set it to the CREATE TABLE statements from the documentation. Then call $modx->exec($sql); E.g.

require_once dirname(__FILE__) . '/build.config.php';
include_once MODX_CORE_PATH . 'model/modx/modx.class.php';
$modx = new modX();
$modx->initialize('mgr');
$modx->loadClass('transport.modPackageBuilder', '', false, true);
$modx->setLogTarget(XPDO_CLI_MODE ? 'ECHO' : 'HTML');

$sql = <<<EOT
CREATE TABLE foo_table1 ...;
EOT;
$modx->exec($sql);

$modx->log(modX::LOG_LEVEL_INFO, 'Done!');
We will do basically the same thing for _build/resolvers/resolve.tables.php (see Adding a Resolver in Part III). Replace the $manager->createObjectContainer('Doodle'); line with the $sql variable and $modx->exec($sql); lines from above. (Ideally, this duplication should be consolidated.)

if ($object->xpdo) {
    switch ($options[xPDOTransport::PACKAGE_ACTION]) {
        case xPDOTransport::ACTION_INSTALL:
            $modx =& $object->xpdo;

            // Since we're not loading an xPDO model package, we don't need
            // the tutorial's $modelPath / createObjectContainer() lines;
            // just run the library's CREATE TABLE statements directly.
            $sql = <<<EOT
CREATE TABLE foo_table1 ...;
EOT;
            $modx->exec($sql);
            break;
        case xPDOTransport::ACTION_UPGRADE:
            break;
    }
}
return true;
Make sure we add a php resolver to our build.transport.php script for the resolvers/resolve.tables.php file. E.g.
$vehicle->resolve('php', array(
    'source' => $sources['resolvers'] . 'resolve.tables.php',
));
For local testing, we can run php _build/build.schema.php to create the table(s). When the package gets installed via the MODX package manager, the _build/resolvers/resolve.tables.php resolver will create the table(s).

Note: It would probably be a good idea to add some error checks. Also, if the package was previously installed, resolve.tables.php will probably need to be adjusted to create tables when upgrading if they don't already exist.

Consuming the library from our model

If we need to pass database info from our model to the library, we can accept a $modx parameter and pass $modx->pdo to the library if it accepts a PDO object. Otherwise, we can get specific connection configuration via $modx->config['dsn'], $modx->config['username'], $modx->config['password'], etc.
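For instance (Foo\Client is a hypothetical stand-in for the library's entry point; adjust to its real constructor):

```php
// Preferred: hand the library the PDO connection MODX already holds.
$client = new \Foo\Client($modx->pdo);

// Alternative, if the library insists on opening its own connection:
$client = new \Foo\Client(
    $modx->config['dsn'],
    $modx->config['username'],
    $modx->config['password']
);
```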

Creating a service using a snippet

If the third-party library can process an HTTP request and generate a response, here's how to get it to do so from a MODX snippet.

Configure a modfoo.core_path setting like in this example.

If we want to use autoloading with our model, add a psr-0 section to the model/composer.json and re-run composer install. (Note that if we want to use psr-0 style autoloading for the model, our model's class files should not contain a .class suffix like the MODX code standards suggest.)
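For example, the model/composer.json with a psr-0 autoload section might look like this (the Modfoo namespace is a placeholder; match it to your model's class namespace):

```json
{
    "require": {
        "bar/foo": "1.*"
    },
    "autoload": {
        "psr-0": {
            "Modfoo": "."
        }
    }
}
```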

Then, add the following near the top of our snippet:
$corePath = $modx->getOption('modfoo.core_path', null,
    $modx->getOption('core_path') . 'components/modfoo/');
require_once $corePath . 'model/vendor/autoload.php';
Otherwise, still require the vendor autoload.php as above, and add a require statement for each of our own model files we need.

If the library uses echo to generate output, we can capture the output in a variable by wrapping calls to the library like the following:


ob_start();
// ... call the library's request handling here ...
$output = ob_get_contents();
ob_end_clean();

return $output;
If the output content type should be something other than HTML, we can add the following before the return statement:
$modx->resource->ContentType->set('mime_type', 'desired/mimetype');
E.g. for JSON output:
$modx->resource->ContentType->set('mime_type', 'application/json');
Note: I'm choosing not to set the content type from the resource settings because then it adds a file extension which is not always desirable.

Finally, create a new resource and set the content to [[!modfooSnippet]].

Make sure the snippet is included in the transport package build script.
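Following the pattern from the Developing an Extra tutorial, packaging the snippet might look roughly like this (getSnippetContent() is the small file-reading helper defined in that tutorial's build script; names here are placeholders):

```php
// In _build/build.transport.php: create the snippet object and attach
// it to the component's category so it ships with the package.
$snippet = $modx->newObject('modSnippet');
$snippet->fromArray(array(
    'id' => 1,
    'name' => 'modfooSnippet',
    'description' => 'Runs the foo library and returns its output.',
    'snippet' => getSnippetContent($sources['snippets'] . 'snippet.modfoo.php'),
), '', true, true);
$category->addMany($snippet);
```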

In closing, I'm sure these tips have room for improvement. But, these are some of the things I encountered recently that did not seem well documented.

I realize this article is not an all-in-one, comprehensive tutorial/walkthrough, but that's why I named it "tips". Feel free to develop your own tutorial based on these tips if you feel inspired to do so. If you do, let me know and I can link to it.


I'm aware that I'm not doing everything the "MODX way" in this article. There are other ways to achieve the same goals, but these are some things that worked for me. If you prefer another way, feel free to comment.


Saturday, June 29, 2013

DOM MutationObserver Performance

TL;DR: jsPerf tests: DOM MutationObserver vs. CSS Animation vs. Mutation Events and DOM MutationObserver vs. CSS Animation vs. Mutation Events 2. The results are not as expected.

In developing a browser extension, I discovered that AJAX on modern web sites can sometimes make things tricky. I figured it would be nice to detect when certain parts of the page (DOM) were updated in order to interact with those parts.

The first thing I found was the DOMNodeInserted event (one of the DOM Mutation Events). But several articles mentioned that it kills page performance. And, according to Mozilla, it's deprecated. So I tried to find another solution.

The next thing I found was a way to detect node insertions using CSS animations, strangely enough. According to the article it performs better than DOMNodeInserted. But, it still seemed a bit hackish to me, so I kept searching.

Then, I discovered Mutation Observers. According to the few articles I read, it should have better performance than the deprecated mutation events. And to me it seems like a more consistent way of doing things than using CSS animation detection. Now, browser support is still limited, but since I would be using it for a browser extension, that was ok with me.

So I was curious: how does the performance of these three ways of detecting changes to the DOM actually compare? A quick search didn't turn up anything, but then I remembered I had recently stumbled onto some jsPerf tests that compared the performance of different ways of doing things in JavaScript.

So I created my own jsPerf test: DOM MutationObserver vs. CSS Animation vs. Mutation Events

It ought to work in browsers that support MutationObserver or WebKitMutationObserver (Chrome 18+, Firefox 14+, IE 11+, Safari 6+ as of when this article was written).

MutationObserver seems to be fastest in my test runs. I expected that CSS animation detection would perform better. I don't know how closely this test correlates to real-world performance, but it's a start, and I'm open to suggestions to improve it.

On a side note, with the browser extension I was developing, I ended up going with a different strategy altogether (at least for now), which avoided the need to detect DOM changes at all. I may write another article about that at some point. But I may need mutation observers sometime in the future.

Update: I fixed an issue in the teardown methods of my first jsPerf test and added a second jsPerf test case, and now the results are different and not what I expected. They seem to vary a lot, and surprisingly, DOMNodeInserted is actually fastest in several cases. In most cases, CSS animation events are slowest. Does that mean there's something wrong with the way I'm testing? Could mutation observers and CSS animation events really be slower than advertised relative to mutation events? I doubt that's the case, and I'm not convinced my two jsPerf test cases prove anything. But I would be interested to know what I'm missing and to see a more accurate performance comparison between these means of detecting DOM updates.

Update 2: I think that the main performance difference between mutation observers vs. mutation events is not so much the performance of the events themselves, but the effect on available CPU for other tasks that need to run in the meantime. So perhaps a better test might be to measure the performance impact on a secondary task that runs at the same time. But I have not yet attempted such a test.