Tuesday, May 19, 2015

cURL on Windows: cURL error 60: SSL certificate problem: unable to get local issuer certificate

Problem:

While running some unit tests on a managed project, I ran across an issue from the Guzzle library.

cURL error 60: SSL certificate problem: unable to get local issuer certificate


Solution:

The latest version of cURL does not ship with a bundle of root certificates, due to security concerns in the past. Instead, you are expected to provide the root certificates that should be trusted as certificate authorities. Most of us are really only concerned with the main ones like Equifax, GlobalSign, and Verisign.

There are several ways to solve this problem, but what I decided to do was have PHP on my system always reference the same PEM file, rather than use other methods (like telling cURL not to validate the peer certificate - please note this is a bad idea!).

In order to solve this problem, we need to obtain a copy of the cacert.pem file that has not been corrupted. I found mine from the cURL web site, which links to Mozilla.org's CA bundle.

Once I had downloaded the PEM file, I placed it in a common development directory on my Windows machine.
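If you would rather script that download, a PowerShell one-liner like the following drops the bundle into that directory (double-check the URL against the cURL site, since the bundle's location has moved over the years):

powershell -Command "Invoke-WebRequest -Uri https://curl.se/ca/cacert.pem -OutFile C:\dev\cacert.pem"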

Last, we just need to tell PHP where to look for this file. Open your php.ini file and set the following property*:

openssl.cafile="C:\dev\cacert.pem"
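If you cannot edit php.ini (or only want the bundle for one project), Guzzle can also be pointed at the file per request through its verify option. A rough sketch, with a placeholder URL:

<?php
// Sketch only: the URL is a placeholder; 'verify' takes the path to the CA bundle.
require 'vendor/autoload.php';

$client = new GuzzleHttp\Client();
$response = $client->get('https://example.com/api', [
    'verify' => 'C:\dev\cacert.pem',
]);
?>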


Hope this solves some head scratching.



*This may change if you use something other than OpenSSL.

Monday, April 13, 2015

Git shows modified files but no local changes have been made

Problem:

I checked out my master branch to work on some hot fixes, but once I had pulled in the latest master, I found two files being tracked as modified. I didn't have any local changes to these files before. Where did these come from?





Solution:
For me this was due to Git not handling a case-sensitive change to a file name. Another developer had changed an upper case letter to a lower case one in a file name. Here is how I solved the problem.

Rename the directory in question to something else; Git should now show its files as deleted. Then `git rm` the directory under its old name and casing (you may need to force it) and run `git status` to be sure that only the old, incorrectly cased directory is staged for commit. Now commit your changes.





Manually rename the directory back to the expected format (the newest casing) of its name. Run `git status` again. If you still see one or more of those files as modified, check out the latest version of the file. Note: if the directory is no longer around, manually create it with the correct casing and then check out its files from your repo.



Now check your project state. You should have a clean working tree, one commit ahead of master.
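Put together as commands, the whole sequence looks roughly like this - the directory names are made up for illustration, with "Assets" as the old casing and "assets" as the new one:

rem Move the directory out of the way so Git sees the old path as deleted
ren Assets assets-temp

rem Remove the old, incorrectly cased path from the index (add -f if Git complains) and commit
git rm -r Assets
git status
git commit -m "Fix directory casing"

rem Put the directory back under its new casing and reset any files still showing as modified
ren assets-temp assets
git status
git checkout -- assets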



Tuesday, March 31, 2015

Composer Update: fatal: No such remote 'composer'

Problem: 

While working on a composer library for a client, I ran into an issue where running the command:

composer update

Caused an error to be printed on the screen that read:

fatal: No such remote 'composer'



Solution:

After playing around with it for some time, I went to the web to see if I could find any solutions. I found this thread.

The solution ended up being that I just needed to remove the .git folder from the library while it resided in the vendor directory. (I actually just renamed it, but that seemed to do the same thing.)
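In practice that was a single rename inside vendor followed by re-running the update. The package path here is made up; substitute your own library's folder:

rem Package path is only an example
ren vendor\acme\my-library\.git .git-bak
composer update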




Monday, March 9, 2015

Python - Unable to find vcvarsall.bat

Problem:
While working on a quick prototype using MaxMind, I wanted to use my new Python skills. The geoip2 module looked like a good place to start, but I had to install the maxminddb module first. Trying to install it using pip gave me the following error:

UserWarning: Unknown distribution option: "bugtrack_url"
...
error: Unable to find vcvarsall.bat


Solution:
Looking online I found a couple of SO questions that were exactly the same as mine: "How do I set vcvarsall.bat?" Turns out this is because when Python builds the extension, it wants to build it for Windows using the same C runtime libraries Python itself was built with. Several people indicated that the solution here was to install Visual C++ 2008 Express Edition (but seeing as it is now 2014, this was somewhat difficult to find).

To install these C Runtime libraries, download Visual C++ 2008 Express Edition.

Once you have it installed, you will need to restart your machine (or do something to allow the new VS90COMNTOOLS environment variable to take effect).

Now you should be able to install your new extension using pip.
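For reference, after the reboot you can confirm the environment variable the installer sets and then retry the install (maxminddb being the module from above):

rem Should print the Visual C++ 2008 tools path once the variable has taken effect
echo %VS90COMNTOOLS%

pip install maxminddb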

WordPress Slow Page Loads - JetPack

Problem:
My wife has a WordPress blog, and over time everything had been slowing down until the site pretty much stopped working altogether.

Solution:
Optimizing WordPress is a very large topic with many resources online. Following suggestions from a couple of different web sites, I decided to start with the plugins, as these are known to cause issues. My wife had about 30 installed. After deactivating all the plugins, things started working much better. I then installed some plugins to help determine which plugin(s) were causing the trouble.

After enabling plugins one at a time and profiling the page loads with P3, I discovered that JetPack was causing the blog some major slowdown. Little did I know, many modules in JetPack were enabled by default and weren't being used. In fact, page load time went from 0.10 seconds to 1.5 seconds per page.

Disable Unused JetPack Modules (or JetPack)
If you click "Learn More" you will then see a new button called Deactivate. This button will turn off the module and help make your blog faster. I strongly suggest keeping only the bare minimum of modules active to reduce unnecessary resource usage. If you can, do not use the JetPack plugin at all; once the developers are able to improve its performance, it will be worth looking at again.

Wednesday, March 4, 2015

Issue Updating Vagrant 1.6.3 to 1.7.2 on Windows: Bundler reporting dependency can't be found.

Problem:

When uninstalling the old version and installing the new version of Vagrant (along with the required system restarts), I was met by this message when trying to boot up my machines.


Bundler, the underlying system used to manage Vagrant plugins,
is reporting that a plugin or its dependency can't be found.
This is usually caused by manual tampering with the 'plugins.json'
file in the Vagrant home directory. To fix this error, please
remove that file and reinstall all your plugins using `vagrant
plugin install`.


Somewhat confused, I tried running the uninstall/reinstall of Vagrant again, but nothing changed.



Solution:

Turns out that I just needed to remove the .vagrant.d folder from my Windows User directory.

For me this was: C:\Users\Mike\.vagrant.d

Then I installed Vagrant one last time. After a reboot, my virtual machines booted up again and I was back in business.
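For reference, the cleanup and a plugin reinstall from a command prompt look roughly like this - the plugin name is only an example; reinstall whichever plugins you actually use:

rmdir /s /q C:\Users\Mike\.vagrant.d

rem After reinstalling Vagrant and rebooting:
vagrant plugin install vagrant-vbguest
vagrant up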

Tuesday, March 3, 2015

MySQL Efficiently Generating Millions of Random Registration Codes

Problem:

While working with a company, I decided to look into how they were generating their registration codes, which were used to allow users to pre-purchase access to their platform. The code behind this was written in PHP and used MySQL to store the codes.

The database structure was set up to allow for each code to be unique. The PHP code would handle each code generated one-at-a-time. First, it would generate a random number-letter combination to a set length and then it would query the database to see if such a code existed. If the query returned no results, it would then insert the new code into the table. If a match existed, it would discard the generated code and try a new one.

As you can see, generating many codes using this process would be highly inefficient. We would actually perform up to two queries for each code - so for 100,000 codes, we would likely run 200,000+ queries against our database.


Solution:

Without really changing too much, we can optimize this process down to maybe 5 queries for the same 100,000 codes (and prevent an unlikely race condition once this table grows super large). First we have to think about what MySQL can do that would really save us with mass queries. MySQL can process a large group of inserts all at one time; the problem is that we then need to deal with duplicate codes. To handle these, we simply ignore the duplicates using MySQL's INSERT IGNORE INTO ...

But we still need to generate all 100,000 codes, right? No problem: we can ask MySQL how many records actually got inserted, then take our original number (100,000) and subtract the successful ones. This tells us how many we still need to generate and insert into the database. To handle this we wrap the process in a loop. It looks something like this:

<?php

/**
 * Generates codes and inserts them into the DB, retrying any that collide
 *
 * @param int $code_count
 */
public function add_code_batch($code_count)
{
    $pending_codes = $code_count;
    while ($pending_codes) {
        $mass_insert = array();
        for ($i = 0; $i < $pending_codes; $i++) {
            $code = $this->generate_code(); // @TODO: Add this
            $mass_insert[] = "('{$code}')";
        }
        // Try to mass insert them all; duplicate codes are silently skipped
        $this->db->query("INSERT IGNORE INTO `codes` (code) VALUES ".implode(',', $mass_insert));
        // Only the rows that actually made it in count against the remaining total
        $pending_codes -= $this->db->affected_rows();
    }
}

?>

Keep in mind the above code may need to be modified to fit your framework and code structure (like including batch IDs or additional information). As you can see, we have drastically improved on the previous situation.
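The generate_code() call above is left as a TODO since it depends on the code format you need. Purely as an illustration (the length and character set here are assumptions, not anyone's actual spec), it might look like this:

<?php

/**
 * Illustration only: builds a random letter/number code of a set length.
 *
 * @param int $length
 * @return string
 */
public function generate_code($length = 10)
{
    // Character set skips easily confused characters like 0/O and 1/I
    $chars = 'ABCDEFGHJKLMNPQRSTUVWXYZ23456789';
    $code = '';
    for ($i = 0; $i < $length; $i++) {
        $code .= $chars[mt_rand(0, strlen($chars) - 1)];
    }
    return $code;
}

?>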

I'm not saying that this can't be optimized more or that it is the best possible solution, but it is a step forward.

Happy coding.


Possible things to explore for better performance:

  • Generating codes through SQL
  • How intense the code generation process is
  • Cost of storing codes in memory between generation and transaction