
Generating SQLite Data Fast: One Billion Rows in Under a Minute

source link: https://blog.gslin.org/archives/2021/07/20/10248/%e5%bf%ab%e9%80%9f%e7%94%a2%e7%94%9f-sqlite-%e8%b3%87%e6%96%99%e7%9a%84%e6%96%b9%e5%bc%8f%ef%bc%9a%e4%b8%80%e5%88%86%e9%90%98%e5%85%a7%e7%94%a2%e7%94%9f%e5%8d%81%e5%84%84%e7%ad%86%e8%b3%87%e6%96%99/


In "Towards Inserting One Billion Rows in SQLite Under A Minute", the author tries to write 1B rows into SQLite within one minute on a 2019 MBP. Several of the techniques along the way are worth playing with. The machine's specs:

The machine I am using is MacBook Pro, 2019 (2.4 GHz Quad Core i5, 8GB, 256GB SSD, Big Sur 11.1)

The first version was written in Python, and inserting 10M rows took about 15 minutes:

In this script, I tried to insert 10M rows, one by one, in a for loop. This version took close to 15 minutes, sparked my curiosity and made me explore further to reduce the time.
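The script itself isn't quoted in the article, but the pattern it describes is simple to reconstruct: one execute() per row inside a for loop, with a single commit at the end. A minimal sketch, with an illustrative schema of my own rather than the author's:

```python
import random
import sqlite3

# Sketch of the naive approach described above: one INSERT per loop
# iteration, committed once at the end.  Schema and value ranges are
# illustrative assumptions, not taken from the article.
conn = sqlite3.connect("naive.db")
conn.execute("CREATE TABLE IF NOT EXISTS user (area TEXT, age INTEGER, active INTEGER)")

for _ in range(10_000_000):
    conn.execute(
        "INSERT INTO user VALUES (?, ?, ?)",
        (str(random.randint(100000, 999999)),
         random.choice((5, 10, 15)),
         random.randint(0, 1)),
    )

conn.commit()
conn.close()
```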

Adding five PRAGMAs got the same approach to 100M rows in ten minutes:

The naive for loop version took about 10 minutes to insert 100M rows.
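The article doesn't repeat the five PRAGMAs in this excerpt, but they are the usual write-throughput knobs recommended for SQLite bulk loads. A sketch of that kind of setup; the specific pragmas and values here are assumptions, each trading durability or safety for speed, which is acceptable for a throwaway benchmark database but not for real data:

```python
import sqlite3

conn = sqlite3.connect("fast.db")

# Typical bulk-load PRAGMAs (assumed, not quoted from the article):
conn.execute("PRAGMA journal_mode = OFF")        # no rollback journal
conn.execute("PRAGMA synchronous = 0")           # skip fsync after writes
conn.execute("PRAGMA cache_size = 1000000")      # much larger page cache
conn.execute("PRAGMA locking_mode = EXCLUSIVE")  # take the lock once, keep it
conn.execute("PRAGMA temp_store = MEMORY")       # temp structures in RAM
```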

Batching the inserts brought that down to eight and a half minutes:

The batched version took about 8.5 minutes to insert 100M rows.
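Again a sketch rather than the author's code: rows are generated in chunks and handed to SQLite with executemany(), one transaction per chunk. Batch size and schema are my own assumptions. Notably, this same file runs unchanged under PyPy (e.g. `pypy3 batched.py`), which is where the 2.5-minute figure below comes from:

```python
import random
import sqlite3

# Sketch of the batched variant: build a chunk of rows in Python and hand
# the whole chunk to SQLite via executemany(), committing per batch.
BATCH = 100_000  # illustrative batch size

conn = sqlite3.connect("batched.db")
conn.execute("CREATE TABLE IF NOT EXISTS user (area TEXT, age INTEGER, active INTEGER)")

def make_batch(n):
    return [
        (str(random.randint(100000, 999999)),
         random.choice((5, 10, 15)),
         random.randint(0, 1))
        for _ in range(n)
    ]

for _ in range(100_000_000 // BATCH):
    conn.executemany("INSERT INTO user VALUES (?, ?, ?)", make_batch(BATCH))
    conn.commit()

conn.close()
```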

Next, out came the classic weapon PyPy, which dropped it to two and a half minutes:

All I had to do was run my existing code, without any change, using PyPy. It worked and the speed bump was phenomenal. The batched version took only 2.5 minutes to insert 100M rows. I got close to 3.5x speed :)

After that came the jump to Rust. There's a fair amount of tuning discussion in between, but skipping straight to the end... the final version took only about 33 seconds for 100M rows:

I created a threaded version, where I had one writer thread that received data from a channel and four other threads which pushed data to the channel. This is the current best version which took about 32.37 seconds.
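The Rust source isn't reproduced in the post, but the design it describes (one writer thread owning the database connection, several producers feeding it over a channel) maps directly onto Python's queue and threading modules. A structural sketch in Python, purely to show the shape of that design; all names and sizes are my own, and CPython's GIL means this won't reproduce the Rust numbers:

```python
import queue
import random
import sqlite3
import threading

# Structural sketch (Python, not Rust) of the quoted design: four producer
# threads generate batches and push them onto a bounded channel; a single
# writer thread owns the SQLite connection and drains the channel.
BATCH, BATCHES = 50_000, 2_000  # 100M rows total; illustrative sizes
channel = queue.Queue(maxsize=16)  # bounded, so producers can't run away
SENTINEL = None

def producer(n_batches):
    for _ in range(n_batches):
        channel.put([
            (str(random.randint(100000, 999999)),
             random.choice((5, 10, 15)),
             random.randint(0, 1))
            for _ in range(BATCH)
        ])

def writer():
    conn = sqlite3.connect("threaded.db")
    conn.execute("CREATE TABLE IF NOT EXISTS user (area TEXT, age INTEGER, active INTEGER)")
    while True:
        batch = channel.get()
        if batch is SENTINEL:
            break
        conn.executemany("INSERT INTO user VALUES (?, ?, ?)", batch)
        conn.commit()
    conn.close()

producers = [threading.Thread(target=producer, args=(BATCHES // 4,)) for _ in range(4)]
w = threading.Thread(target=writer)
w.start()
for p in producers:
    p.start()
for p in producers:
    p.join()
channel.put(SENTINEL)  # all producers done; tell the writer to stop
w.join()
```

In CPython the GIL keeps the four producers from generating data truly in parallel; the point of the sketch is the channel-plus-single-writer structure, which is what the Rust version exploits to keep SQLite's one writer saturated.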

Wherever PyPy is an option, it's still worth considering...


Author: Gea-Suan Lin · Posted on July 20, 2021 · Categories: Computer, Database, Library, Murmuring, Programming, Software · Tags: billion, data, insert, minute, performance, pypy, python, rust, speed, sql, sqlite
