Category Archives: Development

Timezones and MySQL

Recently I presented a talk about Timezones (which I hope to present a few more times). During the preparation I stumbled over a function that eases timezone-based datetime calculations in MySQL: CONVERT_TZ. You can use it to convert a datetime from one timezone into a different one. So let’s say we have a database created like this:

CREATE TABLE `datetime` (
  `id` int(11) unsigned NOT NULL AUTO_INCREMENT,
  `zeit` datetime DEFAULT NULL,
  `zone` varchar(255) DEFAULT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=utf8mb4;

INSERT INTO `datetime` (`id`, `zeit`, `zone`)
VALUES
    (1,'2014-03-04 12:23:34','Europe/Berlin'),
    (2,'2016-05-03 23:12:23','Europe/Busingen'),
    (3,'2016-05-03 23:12:23','America/Chicago');

So the dates in the table are in their respective timezones.

To check which dates are before 14:00 UTC, you can use a SELECT statement like this:

SELECT * FROM datetime 
    WHERE TIME(CONVERT_TZ(zeit, zone, 'UTC')) 
    < "14:00:00";

It will return something like this:

1 | 2014-03-04 12:23:34 | Europe/Berlin
3 | 2016-05-03 23:12:23 | America/Chicago

So with CONVERT_TZ you can convert a datetime (from a datetime field) from one timezone into another. And the timezone can also come from a field in the table.
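A quick standalone example, converting the Berlin value from the first row to UTC:

SELECT CONVERT_TZ('2014-03-04 12:23:34', 'Europe/Berlin', 'UTC');
-- 2014-03-04 11:23:34 (Berlin is UTC+1 in early March)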

The only prerequisite is that the timezone information has to be set up in the database. For that you can load the server’s zoneinfo files by calling mysql_tzinfo_to_sql /usr/share/zoneinfo | mysql -u root mysql.
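If the timezone tables are not loaded, CONVERT_TZ with named timezones simply returns NULL, so a quick check after the import (paths may differ on your system) could look like this:

# import the system's zoneinfo files into the mysql schema
mysql_tzinfo_to_sql /usr/share/zoneinfo | mysql -u root mysql

# this should now return a datetime instead of NULL
mysql -u root -e "SELECT CONVERT_TZ('2014-03-04 12:23:34', 'Europe/Berlin', 'UTC')"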

You can find more information on that at https://dev.mysql.com/doc/refman/5.5/en/time-zone-support.html and https://dev.mysql.com/doc/refman/5.5/en/date-and-time-functions.html#function_convert-tz

Composer and self-signed certificates

Today I wanted to add a package from our internal satis-repository to a composer.json-file. Easy thing!

I added the satis-server as a repository to the composer.json like this:

    {
      "repositories": [{
        "type": "composer",
        "url": "https://example.com/satis"
      }]
    }

Fair enough! That’s it! Run composer and be happy:

$ composer require vendor/package

[Composer\Downloader\TransportException]
  The "https:/example.com/satis/packages.json" file could not be downloaded: SSL operation failed with code 1. OpenSSL Error messages:
  error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed
  Failed to enable crypto
  failed to open stream: operation failed


require [--dev] [--prefer-source] [--prefer-dist] [--no-plugins] [--no-progress] [--no-update] [--update-no-dev] [--update-with-dependencies] [--ignore-platform-reqs] [--sort-packages] [-o|--optimize-autoloader] [-a|--classmap-authoritative] [--] [<packages>]...

WTF…???

Ah, yes! The server’s certificate is signed by our internal root CA. As it’s an internal server, that’s what we do. But how do I get composer to know that? It took me a while. Adding the root certificate to OpenSSL didn’t bring the expected results (might be due to Homebrew), and neither did adding the cert to PHP’s keystore (might be due to some weird setup on my machine).

But there surely has to be a way of getting satis (or Toran) up and running with self-signed certs!

There were many hacks around that disabled certificate validation altogether, but that’s not what I wanted. After a short question on Twitter I got a few ideas. Alexander Turek’s idea was great: adding “verify-peer” : “false” to the ssl-options of the repository. Sadly that didn’t do the trick. Finally Jordi Boggiano gave me a hint to a feature I hadn’t found in the composer docs before: add the root CA certificate to the ssl-options of the repository (and to the project).

So now my composer.json looks like this:

{
  "repositories": [{
    "type": "composer",
    "url": "https://example.com/satis",
    "options" : {
      "ssl" : {
        "cafile" : "myrootca.crt"
      }
    }
  }]
}

The file “myrootca.crt” is a PEM file that only contains the root certificate. You can get it by calling openssl x509 -in <(openssl s_client -connect example.com:443 -prexit 2>/dev/null) > myrootca.crt. And myrootca.crt needs to be in the same directory as the composer.json of your project.
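If you’d rather not copy the certificate into every project, composer also knows a cafile setting in its configuration, so something along these lines should work as well:

composer config --global cafile /path/to/myrootca.crt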

Thanks Jordi for the fast response!

Hope that helps someone 😉

bootstrapping wordpress

The other day I was thinking about how to easily set up a wordpress-site for development, staging and production, and I found that there doesn’t seem to be an easy way “out-of-the-box”.

Why is that?

WordPress is famous for its “5 minute installer” (which actually, after the 20th wordpress-site, takes you 2 minutes max). Upload the folder via FTP onto your server, open your webbrowser, add some basic information and you’ve got a blog up and running. Do some more customizations and you’ve got a decent website.

But it’s never been designed to take that site, move it somewhere else to do some testing and then put everything back again into the live-site. Oh, but only put back the configuration you changed, not the actual data…

And that’s where it becomes tricky. The configuration resides in the same database as the content. And even though we just need the information about which plugins shall be installed, we have to move the complete sourcecode of the plugins from A to B as there is no other way. And to be honest, even moving the complete wordpress-core from A to B is somewhat overdone.

Dependency-Management

This can be circumvented by using dependency management: the wordpress-core and all of the plugins and themes are dependencies of my website. Some examples of how to set up a wordpress-site using composer already exist on the internet. To enable that, there’s even a proxy for all the plugins and themes in the wordpress-repository.
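For illustration, a minimal sketch using the wpackagist.org proxy could look like this (the plugin name is just an example):

{
  "repositories": [{
    "type": "composer",
    "url": "https://wpackagist.org"
  }],
  "require": {
    "wpackagist-plugin/akismet": "*"
  }
}

The wpackagist packages depend on composer/installers, which then places the plugin under wp-content/plugins instead of vendor.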

But it still misses some things.

Multiple identical setups

One thing I wanted was an easy installation on a development system (including a vagrant-setup) while still being able to deploy everything easily onto a different machine with a local database and webserver.

Version Control

Another thing is the possibility to store the setup in a VCS with a minimum amount of files. In my eyes it doesn’t make much sense to have the complete wordpress-core under version-control when a line like "wordpress" : "4.4.*" can be enough.

Automation

And it should also allow me to use the wordpress-repository to install plugins and themes as I’m used to, and to update the dependencies via the webinterface or via an automated process.

An idea

Slowly an idea crept into my mind.

And I’ve started building it: a wordpress-bootstrap in combination with a WordPress-Composer bridge. It’s not finished yet, but I believe it’s a good approach.

Now I can create a running wordpress-installation with some terminal-commands (as long as composer and vagrant are available) like this:

    composer create-project org_heigl/wordpress_bootstrap [my-wordpress-folder]
    # confirm the defaults with "Enter"
    cd [my-wordpress-folder]
    vagrant up

Now visit http://localhost:8080. You can log in using wpadmin and password (and you should change that!)

The setup already has a plugin installed that adds the plugins you activate in that installation to the composer.json-file. So you can store the information about activated plugins and themes by simply adding that composer.json to your VCS, and that’s it. No need to bloat your VCS with the complete plugin-code.

The whole thing is not yet finished! There’s still a lot to do, like handling database-transfers.

But it’s a start.

What do you think? Anything you think that should be included? Open an Issue or fork the project and create a Pull-Request.

A link to your Call for Papers

You have an upcoming conference for which you run a Call for Papers (CfP)? Then why not let everyone know by adding a link to where we can find more information?

Something like the following could be parsed automatically and would make your CfP easier to find:

<link rel="cfp" href="[Link to your CfP-page]" data-closes="[ISO-Date of the end of the CfP]"/>

This could then look like this:

<link rel="cfp" href="https://example.com/cfp" data-closes="2015-12-24 12:00:00+02:00" />

That way websites displaying CfPs could fetch the information that there is a CfP directly from your main website, and you would not need to provide extra information about the CfP in a different place.

And while we’re at it: Your Event surely has a geographical location, so why not add that one via the following resource?

<link rel="venue" href="geo:[latitude],[longitude]"/>

It uses a geo-URI as specified in RFC 5870 and can therefore also be read automatically to check for the venue of your event.

Alternatively you can include information about your event or the location using metadata as defined by schema.org. That way you can add even more information about your event or the location than a simple link allows. Thanks to Jurian Sluiman for reminding me!
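A minimal sketch of such markup, using schema.org’s Event type with made-up values, could look like this:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Example Conference",
  "startDate": "2016-06-01",
  "location": {
    "@type": "Place",
    "name": "Example Venue",
    "geo": {
      "@type": "GeoCoordinates",
      "latitude": 52.52,
      "longitude": 13.405
    }
  }
}
</script>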

Database-Testing with PHPUnit and sqlite

Today I wanted to test a database-handling class using PHPUnit’s PHPUnit_Extensions_Database_TestCase and an SQLite in-memory database.

On running the tests I always got the following error message:

MyDatabaseTest::testDatabaseConnection with data set #0 ()
PHPUnit_Extensions_Database_Operation_Exception: COMPOSITE[TRUNCATE] operation failed on query:
                DELETE FROM "[tablename]"
             using args: Array
(
)
 [SQLSTATE[HY000]: General error: 1 no such table: [tablename]]

WTF…

After some checking what might help, I found different suggestions:

  • Use MySQL as the test-backend – No, I want to use SQLite with in-memory storage as it’s fast and doesn’t require much setup on a CI server
  • Use a pre-configured sqlite-file and therefore a file-based SQLite – That might have been possible, but I just wanted to use in-memory…

The solution was rather simple. It seems that PHPUnit deletes the content of the table in question before seeding it to execute a test. That only works if the table already exists. Therefore I’ve added a create table xyz statement right after setting up the in-memory database. So the test-class now looks like this:

class MyDatabaseTest extends PHPUnit_Extensions_Database_TestCase
{
    protected $pdo = null;

    /**
     * @return PHPUnit_Extensions_Database_DB_IDatabaseConnection
     */
    public function getConnection()
    {
        if (null === $this->pdo) {
            $this->pdo = new PDO('sqlite::memory:');
            // Create the schema up front so the table exists when
            // PHPUnit truncates and seeds it before each test.
            $this->pdo->exec('create table [tablename]([table-definition])');
        }
        return $this->createDefaultDBConnection($this->pdo, ':memory:');
    }

    /**
     * @return PHPUnit_Extensions_Database_DataSet_IDataSet
     */
    public function getDataSet()
    {
        return $this->createXMLDataSet('[path/to/xml-seed-file]');
    }

    public function testDatabaseConnection()
    {
        $pdo = $this->getConnection()->getConnection();
        // Do your database-tests here using the required pdo-object
    }
}
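By the way, the XML seed file referenced in getDataSet() uses PHPUnit’s database-extension dataset format. A minimal example for a hypothetical table could look like this:

<?xml version="1.0" encoding="UTF-8"?>
<dataset>
    <table name="example">
        <column>id</column>
        <column>name</column>
        <row>
            <value>1</value>
            <value>foo</value>
        </row>
    </table>
</dataset>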

Perhaps this can help someone to find a solution earlier 😉