Tuesday, March 31, 2015

Composer Update: fatal: No such remote 'composer'


While working on a composer library for a client, I ran into an issue where running the command:

composer update

caused an error to be printed on the screen that read:

fatal: No such remote 'composer'


After playing around with it for some time, I went to the web to see if I could find any solutions. I found this thread.

The solution ended up being that I just needed to remove the .git folder from the library while it resided in the vendor directory. (I actually just renamed it, but that seemed to have the same effect.)
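If you want to script the workaround, something like the following does the rename. The package path here is hypothetical — substitute your own library's directory under vendor/:

```python
import os
import shutil

# Hypothetical package path -- substitute your library's directory under vendor/
pkg = os.path.join("vendor", "acme", "my-library")
git_dir = os.path.join(pkg, ".git")
if os.path.isdir(git_dir):
    # renaming is enough: Composer stops treating the package as a git checkout
    shutil.move(git_dir, git_dir + ".bak")
```

Renaming rather than deleting keeps the history around in case you need it back.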

Monday, March 9, 2015

Python - Unable to find vcvarsall.bat

Working on a quick prototype using MaxMind, I wanted to use my new Python skills. The geoip2 module looked like a good place to start, but I had to install the maxminddb module first. Trying to install it using pip gave me the following error:

UserWarning: Unknown distribution option: "bugtrack_url"
error: Unable to find vcvarsall.bat

Looking online, I found a couple of SO questions that were exactly the same as mine: "How do I set vcvarsall.bat?" It turns out this happens because when Python builds the extension, it wants to build it for Windows using the same C Runtime libraries Python itself was built with. Several people indicated that the solution here was to install Visual C++ 2008 Express Edition (but seeing as it was now 2015, this was somewhat difficult to find).

To install these C Runtime libraries, download Visual C++ 2008 Express Edition.

Once you have it installed, you will need to restart your machine (or otherwise allow the new VS90COMNTOOLS environment variable to take effect).

Now you should be able to install your new extension using pip.
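If the build still fails, it can help to confirm that the variable is actually visible to your environment; distutils' MSVC 9 support locates the 2008 toolchain through VS90COMNTOOLS. This is just a quick diagnostic sketch, not part of the install:

```python
import os

# distutils looks for VS90COMNTOOLS to find the VC++ 2008 toolchain;
# if this prints None, the build will still fail with the vcvarsall error
print(os.environ.get("VS90COMNTOOLS"))
```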

WordPress Slow Page Loads - JetPack

My wife has a WordPress blog, and over time everything had been slowing down until the site pretty much stopped working altogether.

Optimizing WordPress is a very large topic with many resources online. Following suggestions from a couple of different websites, I decided to start with the plugins, as these are known to cause issues. My wife had about 30 installed. After deactivating all the plugins, things started working much better. I then installed some plugins to assist in determining which plugin(s) were causing the trouble.

After enabling plugins one at a time and profiling page loads using P3, I discovered that JetPack was causing the blog some major slowdown. Little did I know, many modules in JetPack were enabled by default and weren't being used. In fact, page load time went from 0.10 seconds to 1.5 seconds per page with JetPack enabled.

Disable Unused JetPack Modules (or JetPack)
In JetPack's settings, if you click "Learn More" on a module, you will see a Deactivate button. This button turns off the module and will help make your blog faster. I strongly suggest keeping active only the bare minimum to reduce unnecessary resource usage. If you can, do not use the JetPack plugin at all; once the developers are able to improve its performance, it will be worth looking at again.

Wednesday, March 4, 2015

Issue Updating Vagrant 1.6.3 to 1.7.2 on Windows: Bundler reporting a dependency can't be found.


After uninstalling the old version and installing the new version of Vagrant (along with the required system restarts), I was met with this message when trying to boot up my machines.

Bundler, the underlying system used to manage Vagrant plugins,
is reporting that a plugin or its dependency can't be found.
This is usually caused by manual tampering with the 'plugins.json'
file in the Vagrant home directory. To fix this error, please
remove that file and reinstall all your plugins using `vagrant
plugin install`.

Somewhat confused, I tried running the uninstall/reinstall of Vagrant again, but nothing changed.


Turns out that I just needed to remove the .vagrant.d folder from my Windows User directory.

For me this was: C:\Users\Mike\.vagrant.d

Then I installed Vagrant one last time. After a reboot, my virtual machines booted up again and I was back in business.
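In case you hit this repeatedly, the fix can be scripted; this sketch resolves the home directory automatically rather than hard-coding a path like mine:

```python
import os
import shutil

# Vagrant keeps plugin state (plugins.json, gems) in ~/.vagrant.d;
# deleting the folder forces a clean slate before reinstalling Vagrant
vagrant_home = os.path.join(os.path.expanduser("~"), ".vagrant.d")
if os.path.isdir(vagrant_home):
    shutil.rmtree(vagrant_home)
```

Note this removes all installed plugins, so you will need to reinstall them with `vagrant plugin install` afterward.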

Tuesday, March 3, 2015

MySQL Efficiently Generating Millions of Random Registration Codes


While working with a company, I decided to look into how they were generating their registration codes, which were used to allow users to pre-purchase access to their platform. The programming behind this was written in PHP and utilized MySQL to store the codes.

The database structure was set up to allow for each code to be unique. The PHP code would handle each code generated one-at-a-time. First, it would generate a random number-letter combination to a set length and then it would query the database to see if such a code existed. If the query returned no results, it would then insert the new code into the table. If a match existed, it would discard the generated code and try a new one.

As you can see, generating many codes using this process would be highly inefficient. It meant that we would actually perform up to two queries for each code - so for 100,000 codes, we would likely run 200,000+ queries against our database.


Without really changing too much, we can optimize this process down to maybe 5 queries for the same 100,000 codes (and prevent an unlikely race condition once this table grew super large). First we have to think about what MySQL can do that would really save us with mass queries. MySQL can actually process a large group of inserts all at one time. The problem is that we then need to consider the duplicate codes. To handle these, we simply ignore the duplicates using MySQL's INSERT IGNORE INTO ...

But we still need to generate all 100,000 codes, right? No problem: we can ask MySQL how many records actually got inserted, then subtract the successful ones from our original number (100,000). This tells us how many we still need to generate and insert into the database. To handle this, we wrap the process in a loop. It looks something like this:


/**
 * Generates codes and inserts them into the DB, retrying any failures
 * @param int $code_count
 */
public function add_code_batch($code_count)
{
    $pending_codes = $code_count;
    while ($pending_codes) {
        $mass_insert = array();
        for ($i = 0; $i < $pending_codes; $i++) {
            $code = $this->generate_code(); // @TODO: Add this
            $mass_insert[] = "('{$code}')";
        }
        // try to mass insert them all; duplicates are silently skipped
        $this->db->query("INSERT IGNORE INTO `codes` (code) VALUES " . implode(',', $mass_insert));
        // affected_rows() only counts the rows that actually inserted,
        // so subtract them from what is still pending and loop again
        $pending_codes -= $this->db->affected_rows();
    }
}



Keep in mind the above code may need to be modified to fit your framework and code structure (like including batch IDs or additional information). As you can see, we have drastically improved on the previous situation.
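For anyone who wants to experiment with the idea outside of a PHP framework, here is a minimal, self-contained sketch of the same generate-insert-retry loop in Python, using SQLite (whose INSERT OR IGNORE plays the role of MySQL's INSERT IGNORE). The table name, code length, and alphabet are assumptions for illustration:

```python
import random
import sqlite3
import string

def generate_code(length=10):
    """Random number-letter combination of a set length (alphabet assumed)."""
    return ''.join(random.choices(string.ascii_uppercase + string.digits, k=length))

def add_code_batch(conn, code_count):
    """Insert code_count unique codes, retrying however many were duplicates."""
    pending = code_count
    while pending:
        batch = [(generate_code(),) for _ in range(pending)]
        # duplicates are silently skipped; rowcount tells us how many landed
        cur = conn.executemany("INSERT OR IGNORE INTO codes (code) VALUES (?)", batch)
        pending -= cur.rowcount
        conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE codes (code TEXT PRIMARY KEY)")
add_code_batch(conn, 100000)
print(conn.execute("SELECT COUNT(*) FROM codes").fetchone()[0])  # prints 100000
```

The loop terminates because each pass shrinks `pending` by however many rows actually inserted, exactly mirroring the affected-rows subtraction in the PHP version.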

I'm not saying that this can't be optimized more or that it is the best possible solution, but it is a step forward.

Happy coding.

Possible things to explore for better performance:

  • Generating codes through SQL
  • How intense the code generation process is
  • Cost of storing codes in memory between generation and transaction