I wanted to quickly populate some tables from some large CSVs (400k+ records). Parsing the files and importing each line individually was out of the question, and a single mass insert would have used far too much memory. The best solution is MySQL's LOAD DATA functionality, which can import the data within seconds. But figuring out how to do this with Laravel was not easy.
When I tried to execute the query using DB::statement(), MySQL kept throwing an exception saying that I had unbuffered queries active, along with a suggestion to use a PDOStatement. After some searching on the internet and some experimentation, I found a working solution.
// Build a platform-safe path to the CSV file.
$csv = app_path().DIRECTORY_SEPARATOR."database".DIRECTORY_SEPARATOR."resources".DIRECTORY_SEPARATOR."locations.csv";

// Escape the path before interpolating it into the SQL string.
$query = sprintf(
    "LOAD DATA INFILE '%s' INTO TABLE locations
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"'
     LINES TERMINATED BY '\\n'
     IGNORE 0 LINES
     (`location_id`, `country`, `region`, `city`, `postal_code`, `lat`, `lng`, `metro_code`, `area_code`)",
    addslashes($csv)
);
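With the query string built, the remaining step is to execute it through the underlying PDO connection rather than DB::statement(). A minimal sketch of that step, assuming a standard Laravel MySQL connection (getPdo() is Laravel's accessor for the raw PDO instance; the variable names are illustrative):

```php
// Grab the raw PDO handle from Laravel's default connection and run
// the LOAD DATA statement directly, sidestepping the unbuffered-query
// error raised by DB::statement().
$pdo = DB::connection()->getPdo();

// PDO::exec() returns the number of affected rows, or false on failure.
$rows = $pdo->exec($query);
```

Note that LOAD DATA INFILE reads the file from the database server's filesystem and requires the FILE privilege; if the file lives on the client machine, LOAD DATA LOCAL INFILE (with local_infile enabled) is the variant to reach for.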