I'm having trouble importing a large CSV file into MySQL on localhost.
The CSV is about 55 MB and has roughly 750,000 rows.
I rewrote my script so that it parses the CSV and inserts the rows one at a time.
Here is the code:
$row = 1;
if (($handle = fopen("postal_codes.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++) {
            // Each field is itself pipe-delimited; split out the columns we need
            $arr       = explode('|', $data[$c]);
            $postcode  = mysql_real_escape_string($arr[1]);
            $city_name = mysql_real_escape_string($arr[2]);
            $city_slug = mysql_real_escape_string(toAscii($city_name));
            $prov_name = mysql_real_escape_string($arr[3]);
            $prov_slug = mysql_real_escape_string(toAscii($prov_name));
            $prov_abbr = mysql_real_escape_string($arr[4]);
            $lat       = mysql_real_escape_string($arr[6]);
            $lng       = mysql_real_escape_string($arr[7]);
            // One INSERT per record
            mysql_query("insert into cities (`postcode`,`city_name`,`city_slug`,`prov_name`,`prov_slug`,`prov_abbr`,`lat`,`lng`) values ('$postcode','$city_name','$city_slug','$prov_name','$prov_slug','$prov_abbr','$lat','$lng')") or die(mysql_error());
        }
    }
    fclose($handle);
}
The problem is that it takes forever to run. Any suggested solutions would be greatly appreciated.
You are reinventing the wheel. Check out the mysqlimport tool that ships with MySQL. It is an efficient tool for importing CSV data files.
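For example, something along these lines (just a sketch: the database name, credentials and path are placeholders, the delimiter is assumed to be '|' based on the explode('|') in your script, and mysqlimport loads each file into the table named after the file, so the data would first have to be saved as cities.csv):

mysqlimport --local --user=root --password \
    --fields-terminated-by='|' --lines-terminated-by='\n' \
    your_database /path/to/cities.csv

Because your file appears to contain more fields than the columns you insert, the exact column mapping is easier to spell out with LOAD DATA LOCAL INFILE directly (see the sketch further down).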
mysqlimport is a command-line interface to the LOAD DATA LOCAL INFILE SQL statement.
Either should be 10-20 times faster than executing row-by-row INSERTs.
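As a rough illustration in PHP (assuming the file really is pipe-delimited with eight fields per line in the order your script reads them, and that both the server and the client permit LOCAL INFILE), the whole import could become a single statement:

// Sketch only: the column order is assumed from $arr[1]..$arr[7] in the original script;
// @skip1 and @skip2 are user variables that discard the two unused fields.
$sql = "LOAD DATA LOCAL INFILE '/path/to/postal_codes.csv'
        INTO TABLE cities
        FIELDS TERMINATED BY '|'
        LINES TERMINATED BY '\\n'
        (@skip1, postcode, city_name, prov_name, prov_abbr, @skip2, lat, lng)";
mysql_query($sql) or die(mysql_error());
// city_slug and prov_slug are not produced here, since toAscii() is a PHP function;
// they would have to be filled in afterwards, e.g. with a separate UPDATE pass.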