So I have to think of something else, since I don't want to preprocess the CSV file. Since the comma (,) is the field terminator, the 2nd record will be considered as 4 fields, and the qualifier will be included in the table (e.g. 'Insane). You can import a CSV file into a specific database: let's first create a dummy database named 'Bar' and try to import the CSV file into the Bar database. If you import the XLS file into Google Refine and then export it as a CSV, you'll have a data format which works great with libraries such as Ruby's CSV class. Alternatively, I would use SQL Server Integration Services (SSIS) and load the data into staging tables, then verify/massage the data there.

Is there anything in my table DDL that would slow down an import? CREATE TABLE `some_schema`. I looked at the DDL for what I had been testing with vs. the tags table and assumed it must be the key slowing things down. The reason I suspected the key is that I tried the same tags.csv dataset that the other commenter used, and for that table (with the same DDL the commenter used) there was no speed difference between DataGrip and Sequel Pro. But when I tried to import the data without the key, it still took about 2m40s via DataGrip, whereas Sequel Pro imports the data in 30 seconds, even with the key. I have reached out to their support to try to understand what command Sequel Pro is using to load data.

16:03:39 finished - execution time: 2 m 50 s 437 ms, fetching time: 1 ms, total update count: 10000
INSERT INTO import_perf_test_tags_datagrip (user_id, email) VALUES (?, ?)
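The field-terminator problem above can be sketched quickly. This is a minimal illustration, not tied to any particular import tool: the sample record is hypothetical, and it just shows why a record whose quoted field contains a comma is miscounted as 4 fields by a naive split, while a parser that honors the text qualifier keeps it at 3.

```python
import csv
import io

# Hypothetical two-line sample: the data record has a quoted field that
# contains a comma, mirroring the "2nd record becomes 4 fields" problem.
raw = 'id,name,email\n1,"Smith, John",john@example.com\n'

# Naive split on the field terminator: the comma inside the quotes is
# wrongly treated as a terminator, and the qualifier leaks into the data.
naive = raw.splitlines()[1].split(',')
print(len(naive))  # 4 -- one field too many, with stray '"' characters

# A parser that honors the qualifier keeps the quoted comma in one field.
rows = list(csv.reader(io.StringIO(raw)))
print(rows[1])  # ['1', 'Smith, John', 'john@example.com'] -- 3 clean fields
```

In MySQL terms, the equivalent fix is telling the loader about the qualifier (e.g. `ENCLOSED BY '"'` in a LOAD DATA statement) rather than preprocessing the file.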
I have another MySQL client on my Mac (Sequel Pro) that will import a 10,000-row file with two columns (IDs and email addresses) in about 30 seconds, whereas DataGrip takes almost 3 minutes. I am only experiencing performance issues with the import; server ping isn't very straightforward to measure since I'm connecting via an SSH tunnel. I did do some testing on a table without any keys specified, and there was no performance discrepancy. One could easily export this data in other structured formats, such as comma-separated values (CSV) or XML, for further analysis. Is it possible to see the exact command that DataGrip is using to load data, so that I can cross-reference it against what my other SQL client is doing? Again, this is all I see in the DataGrip logs:

16:00:49 finished - execution time: 87 ms, fetching time: 106 ms, total result sets count: 1
SELECT t.* FROM import_perf_test_tags_datagrip t
16:00:48 finished - execution time: 171 ms
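The parameterized INSERT in the log is consistent with the rows being sent one statement at a time, where per-row round-trip latency over the SSH tunnel would dominate; a bulk path (one statement per batch, or LOAD DATA) avoids that. This is a minimal sketch of the two strategies, assuming synthetic (user_id, email) data and using SQLite as a stand-in for MySQL so it runs anywhere; only the table name is taken from the log above.

```python
import sqlite3

# SQLite stand-in for the MySQL target; table name mirrors the DataGrip log.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE import_perf_test_tags_datagrip (user_id INTEGER, email TEXT)"
)

# Synthetic 10,000-row dataset, like the two-column file in the question.
rows = [(i, f"user{i}@example.com") for i in range(10_000)]

# Strategy 1: one parameterized INSERT per row. Against a remote server,
# each execute() is a round trip, so tunnel latency multiplies by row count.
for row in rows[:100]:  # small slice, just to illustrate the shape
    conn.execute(
        "INSERT INTO import_perf_test_tags_datagrip (user_id, email) VALUES (?, ?)",
        row,
    )

# Strategy 2: batched insert -- the driver sends the rows in bulk.
conn.executemany(
    "INSERT INTO import_perf_test_tags_datagrip (user_id, email) VALUES (?, ?)",
    rows[100:],
)
conn.commit()

count = conn.execute(
    "SELECT COUNT(*) FROM import_perf_test_tags_datagrip"
).fetchone()[0]
print(count)  # 10000
```

Whether DataGrip or Sequel Pro actually batches, and how, is exactly what the question is trying to find out; enabling the MySQL general query log on the server would show the real statements each client sends.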