Saturday, June 30, 2018

There's a Rift in my space-time continuum

In other words, I GOT AN OCULUS RIFT!

VR in 6 glorious degrees of freedom. It's definitely an upgrade from my Gear VR, which was a lot of fun but only tracked 3 degrees of freedom. With the Rift and its Touch controllers, being able to see where both of my hands actually are in VR, and even roughly where my fingers are, is a big deal. My head movements are fully tracked, and running on a computer instead of my phone enables richer and more sophisticated immersive experiences.

I am very much enjoying experiencing virtual reality through my new system. I keep feeling a sense of wonder as I play, experiment, explore and even get a bit of exercise.

I've been a VR enthusiast since I was a teenager. I have owned a handful of VR-related components including Power Gloves, a Leap Motion sensor, Google Cardboard, and Gear VR. But I have wanted a Rift for a while and I am very excited to finally have one and dive deeper into the virtual world.

Saturday, November 09, 2013

Tips for Integrating a Third-Party PHP Library with MODX

What if you want to add functionality to MODX for which there's not yet a MODX extra, but there is a third-party PHP library that does what you want? Here are a few tips I discovered that may be of help to others. They may or may not work for you, depending on the library and environment.

In many cases, you may just want to place raw PHP scripts alongside your MODX scripts to run independently of MODX. You can access MODX from those scripts as well. But if you want to call the library from within MODX, or even possibly to create your own MODX extra, then this article may be for you.

For this article, let's call the third-party library foo and the MODX extra modfoo.

Create a Transport Package

First off, let's set up a project directory and build script for the modfoo component we are building with the third-party library foo (see Creating a 3rd Party Component Build Script). In particular, it will need a build.transport.php file that has a file resolver for the core folder. If any snippets or other elements will be created, it will need to package those, along with any assets. In the Database section we will be adding a resolver to create the database tables as well.


If the library is available as a Composer package, here’s how we can add it to our project. (Note that we are not using Composer from the project's root directory since then the library wouldn't be packaged with our build script.)
  • Install Composer if it's not installed. Make sure composer can be run from the command-line.
  • Open the command-line and navigate to the modfoo project folder
  • From there, navigate to core/components/modfoo/model (create it if it does not yet exist)
  • Run:
    composer init 
  • Add the package to the composer.json file in the model directory. E.g.
        "require": {
            "bar/foo": "1.*"
        }
  • Then run:
    composer install
  • A vendor directory will be created with the foo package. We can add it to our .gitignore file.
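For example, assuming the directory layout above, the .gitignore entry (relative to the project root) would be:

```
core/components/modfoo/model/vendor/
```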


Database

The third-party library may require some database tables to be created in order to work. Chances are that the library wouldn't have been built with xPDO in mind.

We'll use sections from the Developing an Extra in MODX Revolution tutorial for reference, but with some changes.

We can pretty much ignore the Making the Model section. This tries to define the schema via XML, then generate xPDO class files and database tables from that. But we don't want to generate additional class files since the library should already have its own.

We'll skip to The Schema Parsing Script section, and use the first 8 lines of the sample _build/build.schema.php file.

Then, use the library's documentation to locate the SQL statements to create the table(s). Create a $sql string variable and set it to the CREATE TABLE statements from the documentation. Then call $modx->exec($sql); E.g.

require_once dirname(__FILE__) . '/build.config.php';
include_once MODX_CORE_PATH . 'model/modx/modx.class.php';
$modx = new modX();
$modx->loadClass('transport.modPackageBuilder', '', false, true);
$modx->setLogTarget(XPDO_CLI_MODE ? 'ECHO' : 'HTML');

$sql = <<<EOT
CREATE TABLE foo_table1 ...;
EOT;

$modx->exec($sql);

$modx->log(modX::LOG_LEVEL_INFO, 'Done!');
We will do basically the same thing for _build/resolvers/resolve.tables.php (see Adding a Resolver in Part III). Replace the $manager->createObjectContainer('Doodle'); line with the $sql variable and $modx->exec($sql); lines from above. (Ideally, this duplication should be consolidated.)

if ($object->xpdo) {
    switch ($options[xPDOTransport::PACKAGE_ACTION]) {
        case xPDOTransport::ACTION_INSTALL:
            $modx =& $object->xpdo;
            $modelPath = $modx->getOption('modfoo.core_path', null,
                $modx->getOption('core_path') . 'components/modfoo/') . 'model/';

            $sql = <<<EOT
CREATE TABLE foo_table1 ...;
EOT;
            $modx->exec($sql);
            break;
        case xPDOTransport::ACTION_UPGRADE:
            break;
    }
}
return true;
Make sure we add a php resolver to our build.transport.php script for the resolvers/resolve.tables.php file. E.g.
$vehicle->resolve('php', array(
    'source' => $sources['resolvers'] . 'resolve.tables.php',
));
For local testing, we can run php _build/build.schema.php to create the table(s). When the package gets installed via the MODX package manager, the _build/resolvers/resolve.tables.php resolver will create the table(s).

Note: It would probably be a good idea to add some error checks. Also, if the package was previously installed, resolve.tables.php will probably need to be adjusted to create tables when upgrading if they don't already exist.

Consuming the library from our model

If we need to pass database info from our model to the library, we can accept a $modx parameter and pass $modx->pdo to the library if it accepts a PDO object. Otherwise, we can get specific connection configuration via $modx->config['dsn'], $modx->config['username'], $modx->config['password'], etc.

Creating a service using a snippet

If the third-party library can process an HTTP request and generate a response, here's how to get it to do so from a MODX snippet.

Configure a modfoo.core_path setting like in this example.

If we want to use autoloading with our model, add a psr-0 section to the model/composer.json and re-run composer install. (Note that if we want to use psr-0 style autoloading for the model, our model's class files should not contain a .class suffix like the MODX code standards suggest.)
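As a sketch (the "ModFoo" prefix is a placeholder for whatever prefix our model classes actually use), model/composer.json might end up looking like:

```json
{
    "require": {
        "bar/foo": "1.*"
    },
    "autoload": {
        "psr-0": {
            "ModFoo": ""
        }
    }
}
```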

Then, add the following near the top of our snippet:
$corePath = $modx->getOption('modfoo.core_path', null,
    $modx->getOption('core_path') . 'components/modfoo/');
require_once $corePath . 'model/vendor/autoload.php';
Otherwise, do the above and add a require statement for each of our own model files we need.

If the library uses echo to generate output, we can capture the output in a variable by wrapping calls to the library like the following:

ob_start();

// ...call the library here...

$output = ob_get_contents();
ob_end_clean();

return $output;
If the output content type should be something other than HTML, we can add the following before the return statement:
$modx->resource->ContentType->set('mime_type', 'desired/mimetype');
E.g. for JSON output:
$modx->resource->ContentType->set('mime_type', 'application/json');
Note: I'm choosing not to set the content type from the resource settings because then it adds a file extension which is not always desirable.

Finally, create a new resource and set the content to [[!modfooSnippet]].

Make sure the snippet is included in the transport package build script.

In closing, I'm sure these tips have room for improvement. But, these are some of the things I encountered recently that did not seem well documented.

I realize this article is not an all-in-one, comprehensive tutorial/walkthrough, but that's why I named it "tips". Feel free to develop your own tutorial based on these tips if you feel inspired to do so. If you do, let me know and I can link to it.


I'm aware that I'm not doing everything the "MODX way" in this article. There are other ways to achieve the same goals, but these are some things that worked for me. If you prefer another way, feel free to comment.


Saturday, June 29, 2013

DOM MutationObserver Performance

TL;DR: jsPerf tests: DOM MutationObserver vs. CSS Animation vs. Mutation Events and DOM MutationObserver vs. CSS Animation vs. Mutation Events 2. The results are not as expected.

In developing a browser extension, I discovered that AJAX on modern web sites can sometimes make things tricky. I figured it would be nice to detect when certain parts of the page (DOM) were updated in order to interact with those parts.

The first thing I found was the DOMNodeInserted event (one of the DOM Mutation Events). But several articles mentioned that it kills page performance. And, according to Mozilla, it's deprecated. So I tried to find another solution.

The next thing I found was a way to detect node insertions using CSS animations, strangely enough. According to the article it performs better than DOMNodeInserted. But, it still seemed a bit hackish to me, so I kept searching.

Then, I discovered Mutation Observers. According to the few articles I read, it should have better performance than the deprecated mutation events. And to me it seems like a more consistent way of doing things than using CSS animation detection. Now, browser support is still limited, but since I would be using it for a browser extension, that was ok with me.
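For reference, here's a minimal sketch (the helper name observeInsertions is mine, not from any of the articles) of how a mutation observer can report inserted nodes, with the vendor-prefixed fallback that some browsers required at the time:

```javascript
// Minimal sketch: call onInsert for every node added under `target`.
// Falls back to the vendor-prefixed WebKitMutationObserver where needed.
function observeInsertions(target, onInsert) {
  var MO = (typeof MutationObserver !== 'undefined')
    ? MutationObserver
    : WebKitMutationObserver;
  var observer = new MO(function (mutations) {
    mutations.forEach(function (m) {
      for (var i = 0; i < m.addedNodes.length; i++) {
        onInsert(m.addedNodes[i]);
      }
    });
  });
  // childList + subtree: report nodes added anywhere below `target`
  observer.observe(target, { childList: true, subtree: true });
  return observer; // call observer.disconnect() to stop watching
}
```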

So I was curious: how does the performance of these three ways of detecting changes to the DOM actually compare? I tried a quick search but didn't find anything. Then I remembered I had recently stumbled onto some jsPerf tests that compared performance of different ways of doing things in JavaScript.

So I created my own jsPerf test: DOM MutationObserver vs. CSS Animation vs. Mutation Events

It ought to work in browsers that support MutationObserver or WebKitMutationObserver (Chrome 18+, Firefox 14+, IE 11+, Safari 6+ as of when this article was written).

MutationObserver seems to be fastest in my test runs. I expected that CSS animation detection would perform better. I don't know how closely this test correlates to real-world performance, but it's a start, and I'm open to suggestions to improve it.

On a side note, with the browser extension I was developing, I ended up going with a different strategy altogether (at least for now), which avoided the need to detect DOM changes at all. I may write another article about that at some point. But I may need mutation observers sometime in the future.

Update: I fixed an issue in my first jsPerf teardown methods and added a second jsPerf test case and now the results are different and not what I expected. They seem to vary a lot, and surprisingly, DOMNodeInserted is actually fastest in several cases. In most cases, CSS animation events are slowest. Does that mean there's something wrong with the way I'm testing? Could mutation observers and CSS animation events really be slower than advertised relative to mutation events? I doubt that's the case and I'm not convinced my two jsPerf test cases prove anything. But I would be interested to know what I am missing and to see a more accurate performance comparison between these means of detecting DOM updates.

Update 2: I think that the main performance difference between mutation observers vs. mutation events is not so much the performance of the events themselves, but the effect on available CPU for other tasks that need to run in the meantime. So perhaps a better test might be to measure the performance impact on a secondary task that runs at the same time. But I have not yet attempted such a test.

Monday, September 10, 2012

How to retrieve Domains By Proxy customer number(s) from Go Daddy using Google Chrome

Note: See Escaping GoDaddy and Domains by Proxy by Aaron James Young for a simpler method. My article is more technical and involves more steps.

If you would like, you may skip directly to the instructions.


When I attempted to transfer a domain away from Go Daddy last December, I had trouble retrieving the Domains By Proxy customer number using the Retrieve Customer Number tool. I think it was because the e-mail address on file was disabled the first time I tried using it. I then reactivated the e-mail account, but there were no messages with the customer number. My subsequent attempts to retrieve it also failed.

I was frustrated. But then I discovered the eHow article, How to Remove Domains by Proxy, which gave me hope. It didn't work for me, but it led me in the right direction. The second sentence of Step 1 says, "At the bottom of this page, you will see your Domains by Proxy user name, which is all numbers." I did not see the number the instructions were referring to. But after some investigation I found a way to locate it, which I explain below.



The instructions below describe how to search the DOM for a div with id="privacy-selected-div" and delete or disable the display: none; rule of the div's style attribute. This will display the section: "Select your Domains By Proxy® account (Private Registration account)". Your Domains By Proxy customer number(s) should be listed next to one or more radio buttons labeled "Login:".


Follow these instructions at your own risk. They involve technical steps such as viewing and updating the Document Object Model (DOM) of a web page you are viewing. Also, I documented these steps several months ago (December 2011) and Go Daddy may have changed things since then. So, it's possible these steps are now completely wrong.

The following instructions are for Google Chrome. If you have a different web browser, it may be possible to retrieve your customer number(s) using your browser’s development tools, but the exact instructions will vary and are up to you.

Step-by-step Instructions

  1. Open Google Chrome
  2. Log in to your Go Daddy account
  3. Under My Account click My Renewals
    • You should now be at the Go Daddy My Renewals page
  4. Open Chrome’s Developer Tools window:
    • By right-clicking anywhere on the web page (such as a margin) and selecting Inspect element
    • Or by pressing Ctrl+Shift+I
  5. In the Search Elements box, search for privacy-selected-div:
    • Make sure Elements is selected at the top of the Developer Tools window.
    • The following element should be located:
      <div id="privacy-selected-div" 
      style="display: none; …">…</div>
    • Click on the located element (e.g. click on the yellow highlighted section)
    • The line should turn blue.
  6. In the right side of the window, locate the “Styles” section.
    • If it is not expanded, click to expand it (the arrow should point down).
    • Place your mouse cursor over the following block under Styles:
      element.style {
          display: none;
          margin-left: 22px;
      }
    • Click the checkbox that appears to the right of the display: none; rule.
    • The rule should now have a line through it, indicating it is disabled, which should cause the hidden section to be displayed.
How to display privacy-selected-div using Chrome's Developer Tools
Screenshot showing how to display the privacy-selected-div using Chrome's Developer Tools.
  7. Now, switch back to the Chrome window and tab with the Go Daddy My Renewals web page.
    • You should now see the heading, “Select your Domains By Proxy® account (Private Registration account)” below the table with your domains.
    • Below this, you should see one or more radio buttons with the label “Login:” and a number.
    • The number(s) is/are your Domains By Proxy customer number(s).
Screenshot of the My Renewals page
Screenshot of the Go Daddy My Renewals page showing the Domains By Proxy customer number (highlighted at the lower left).
  8. Log out of your Go Daddy account.
    • Do NOT click Continue on the My Renewals page if you do not wish to renew your domain(s) at this time.


  • Follow these instructions at your own risk.
  • Do NOT click Continue on the My Renewals page if you do not wish to renew your domain(s) at this time. I am not responsible for any charges.
  • Go Daddy may change their site at any time causing these instructions to fail. It worked for me when I drafted this article (December 2011). It may or may not work for you.
  • View source from your browser probably will not work because the relevant section of the page appears to be generated via JavaScript.
  • It is unlikely that I will be able to answer questions you may have. For one reason, I no longer have access to a Go Daddy account.

Other options for DOM/web ninjas

  • Get values from input elements with name="dbpaccts".
    • If you want more info, look at the corresponding label element(s).
  • It might also be possible to obtain your customer number(s) by peeking at JSON or AJAX.
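For example, a small helper like the following (the function name is mine), pasted into the DevTools console on the My Renewals page, could collect the values, assuming the inputs are still named dbpaccts:

```javascript
// Sketch: gather Domains By Proxy account numbers from the page.
// Assumes the radio inputs are still named "dbpaccts";
// Go Daddy may have changed this since the article was written.
function getDbpAccounts(doc) {
  var inputs = doc.querySelectorAll('input[name="dbpaccts"]');
  return Array.prototype.map.call(inputs, function (el) {
    return el.value;
  });
}
// In the console: getDbpAccounts(document)
```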

See Also

Monday, April 30, 2012

A rant on time zone code usage

Or, an idea to simplify time zone abbreviations in daily usage

Time zones for North America

Say you’re checking your Twitter account while hanging out at a coffee shop on a hot summer day, and you notice the following update from your favorite musician:

@rockinmusician: Can’t make it to the San Francisco show this weekend? Watch it live online this Saturday at 7 PM PST!

Did you catch that? Does anything about the time look funny to you? Is everything dandy with that PST? Since, you know, standard time is in the summer and daylight saving time is in the winter, right? Or maybe the S stands for summer? Not quite.

I see this error way too often on social media, websites, flyers, etc. Mostly, I’ve held my tongue, but it continually annoys me, so I’m venting here.

Since rockinmusician’s concert is in the summer (Pacific Daylight Time in San Francisco), the actual time should be written 7 PM PDT. Or should it?

Should we even continue to use traditional U.S. time zone codes/abbreviations (i.e. EST/EDT, CST/CDT, MST/MDT, PST/PDT, etc.) when specifying times in day-to-day communication?

Wrong information

Why do we even bother adding the S or D in our time zone codes if we’re going to be wrong about it? I think saying the wrong thing is worse than being too general. And because it’s been communicated incorrectly so often (at least in social media), it has become unreliable. Like the boy who cried “wolf”.

It may even cause clients/fans/etc. to lose a bit of respect when they see the obvious mistake.

Useless information

Furthermore, the vast majority of the time, it adds no meaningful information to specify whether an event happens during standard or daylight saving time, even when it is communicated correctly. In many cases, whether standard or daylight saving time is in effect will be the same for both parties communicating, even if they are in different time zones. And when it's not, they will usually be aware of that.

So the middle letter in the time zone code has just become noise.

I think we can simplify a bit.

Who cares?


Who cares if your event is PST or PDT? If I know it’s Pacific Time, then 99.9% of the time I don’t need you to tell me whether it’s standard or daylight saving time. So why bother even adding the S or D? Especially if it’s going to be wrong.

For the rare instances that it really matters, then yes, do use the original codes. But because the codes have been so misused, you may have to add an extra note or something to make sure people take notice.

I remember our old office voicemail menu shared the office hours like “9 AM to 5:30 PM Pacific Daylight Time.” That meant a special update to the voicemail twice each year. Ugh – so pointless. I was glad when it was simplified to “Pacific Time.” The hours listed on the website were similarly simplified, to my delight.

An alternative

For everyday communication, let’s just drop the middle letter and use shorter codes for U.S. times. So Eastern Time would be ET, etc. The original example would become 7 PM PT. Below is a table of my proposed shorter time zone codes.

Time Zone                   Short Code
Eastern Time Zone           ET
Central Time Zone           CT
Mountain Time Zone*         MT
Pacific Time Zone           PT
Alaska Time Zone            AKT
Hawaii-Aleutian Time Zone*  HAT or HT

The shorter code is slightly simpler, reduces chance for error, and its meaning should still be obvious when used next to a time.

Or, you could always spell it out, like 7 PM Pacific.
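The mapping is mechanical enough to sketch as a tiny function (the helper name is mine; the table leaves HAT vs. HT open, so this picks HT):

```javascript
// Sketch: collapse a U.S. time zone code to the proposed short form.
// EST/EDT -> ET, CST/CDT -> CT, MST/MDT -> MT, PST/PDT -> PT,
// AKST/AKDT -> AKT, HST/HAST/HADT -> HT (the table also allows HAT).
function shortTimeZoneCode(code) {
  var c = code.toUpperCase();
  if (c === 'HST' || c === 'HAST' || c === 'HADT') return 'HT';
  // Strip the standard/daylight letter: "PST" and "PDT" both become "PT"
  return c.replace(/[SD]T$/, 'T');
}
```
So the original example's "7 PM PST" simply becomes "7 PM PT", correct in any season.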


Again, I’m not saying original time zone codes should be dropped entirely. They should probably still be used in more formal publications. But if using them, make sure there is a good reason for the added complexity. And make sure it’s correct and verified by an editor (or multiple editors).

Also, some computer programs, etc. must use the full codes. But we’re human, so we can generalize when appropriate.

* Certain parts of the U.S. do not "celebrate" daylight saving time. That is one case where, for example, MST vs. MDT can matter. But people in those areas should already be aware of that. And events originating from those areas (with outside participants) should probably have an additional note making sure others are aware of the difference anyway.

I don’t know if these new codes will cause any confusion when communicating U.S. times internationally, but these are my thoughts from my own experience. It appears some codes already conflict internationally, so I don’t think using shorter codes would cause additional confusion. If you have similar frustrations regarding time zone codes in your region, feel free to adapt this idea.

Saturday, October 02, 2010

On a Journey

I have begun a new phase of my journey. I am beginning an internship at the International House of Prayer in Kansas City (IHOP-KC). During the internship I will have the opportunity to learn more about God and grow closer to Him in an environment of 24/7 prayer and worship.

I first visited IHOP-KC in December 2008 while visiting friends in Kansas City. I was in a dry season spiritually and was more interested in visiting my friends than the prayer room. But I did go and God greatly ministered to me. I got set free from some of my struggles and shame. And I fell in love with Jesus again.

God has continued to work in me since then. I visited IHOP-KC multiple times and went to a couple events and God continued to minister to me. After praying, I felt led to spend a longer season pursuing God in Kansas City through an internship.

Would you consider partnering with me through a special gift during this journey? Any amount you can give will be appreciated. Thank you!

Note: Gifts are not tax-deductible.

Tuesday, September 21, 2010

Time for a transition

I'm moving to Kansas City.

For the past six years I've worked at the Every Nation Ministries office in Nashville and Los Angeles. It's been amazing and I will miss my coworkers who made working there a joy. But for the past couple years I have been feeling that it's time for a transition.

And that time is now. Well, next week, actually.

I plan to spend this next season at the International House of Prayer in Kansas City. Over the past couple years I visited a few times and God has used IHOP-KC to breathe new life into my relationship with Him. I am excited to spend this next season there.

I feel like I have already learned and grown so much this year. I'm going to miss Nashville and all my friends there, but I am in love with God and full of faith. My eyes are on Him and I know He's got me in His hands.