

Discussion (30 Comments) · Read Original on HackerNews

mmarian · 1 minute ago
It's been a lifesaver for some analysis I had to do on 70GB of Cloudflare logs.
fzumstein · about 3 hours ago
DuckDB also runs in Excel, by the way, via the free xlwings Lite add-in that you can install from the add-in store. It uses the Python package and lets you write scripts and custom functions, as well as use a Jupyter-like notebook workflow.
goerch · about 2 hours ago
If you start with Excel, I'll counter with Postgres: https://github.com/duckdb/pg_duckdb. I haven't found the time to check this on one of our installations, though.
uwemaurer · about 3 hours ago
I benchmarked DuckDB 1.5.2 with the latest Java JDBC driver, which now supports user-defined functions. This allows very fast modifications: https://sqg.dev/blog/java-duckdb-benchmark/
ramraj07 · about 2 hours ago
Did they finally enable full SIMD, or do they keep insisting it's okay not to have it?
goerch · about 1 hour ago
Hm, our internal benchmarking shows something like a 30x speedup compared to SQLite (https://github.com/ClickHouse/ClickBench shows an even greater speedup due to not considering cache size). Doing a back-of-the-envelope calculation, I'd estimate 8x for multithreading and 4x for SIMD. Should we expect even more?
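The arithmetic behind that estimate, spelled out (the 8 threads and 4 SIMD lanes are the comment's assumed figures, not measured values):

```python
# Back-of-the-envelope decomposition of the observed ~30x speedup over SQLite.
threads = 8       # assumption: 8 cores; DuckDB parallelizes, SQLite is single-threaded
simd_width = 4    # assumption: four 32-bit lanes in a 128-bit vector register

# Ideal combined speedup if both factors multiply cleanly.
ideal_speedup = threads * simd_width
print(ideal_speedup)  # 32, close to the observed ~30x
```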
gigatexal · about 1 hour ago
fwiw:

"Does DuckDB use SIMD? DuckDB does not use explicit SIMD (single instruction, multiple data) instructions because they greatly complicate portability and compilation. Instead, DuckDB uses implicit SIMD, where we go to great lengths to write our C++ code in such a way that the compiler can auto-generate SIMD instructions for the specific hardware. As an example why this is a good idea, it took 10 minutes to port DuckDB to the Apple Silicon architecture."

https://duckdb.org/faq

andrewstuart · 27 minutes ago
I found it unusable due to out-of-memory errors with a billion-row, 8-column dataset.

It needs manual tuning to avoid those errors, and I couldn't find the right incantation, nor should I need to: memory management is the job of the db, not me. Far too flaky for any production usage.

goerch · 9 minutes ago
That sounds like a rather serious application. Did you file an issue?
andrewstuart · 7 minutes ago
No, I tried ClickHouse instead, which worked without crashing or manual memory tuning.

Search the issues on the DuckDB GitHub: there are at least 110 open and closed OOM (out-of-memory) issues and maybe 400 to 500 that reference “memory”.

goerch · 3 minutes ago
Understood: SQLite is to Postgres as DuckDB is to ClickHouse.
gigatexal · about 1 hour ago
Data engineer here: I use this all the time. It's amazing. For most of the data sizes we often deal with, it's perfect.
goerch · about 1 hour ago
> For most of the data sizes we often deal with, it's perfect.

Interested here: for me it works for out-of-core work. Where is the limit? On a related note: do you need to handle concurrency restrictions?

gigatexal · about 1 hour ago
I must be doing something wrong, but if I try a huge join on a table bigger than my RAM, no matter the flags or spill-to-disk modes enabled, I get crashes. I'm sure I'm doing something wrong.
goerch · 24 minutes ago
Hm, only anecdotal evidence, but the PageRank computation for Wikipedia works on my laptop (https://github.com/idesis-gmbh/WikiExperiments), where `NetworkX` fails. And it uses some joins, like here: https://github.com/idesis-gmbh/WikiExperiments/blob/0b108f3f...
whalesalad · about 4 hours ago
DuckDB is a generational technology innovation: insanely good ergonomics, great performance. It's awesome.
goerch · about 3 hours ago
Can confirm: together with `dbt` and `rill` I'm able to do [this](https://github.com/idesis-gmbh/GitHubExperiments/blob/master...) on my laptop.
steve_adams_86 · about 3 hours ago
Whoa, nice! I could see this being useful to people I work with. Do you think it would be a good setup for people who are technical but not great software developers? People who use basic R and Python for ETL and analysis, mostly.
goerch · about 3 hours ago
I'm using DuckDB in another project (on my laptop) where `NetworkX` fails due to the memory limit of 32 GB. So yes, as soon as you are doing out of core work I'd assume the combination to be quite powerful. Knowledge in SQL would be a plus, though.
rick1290 · about 2 hours ago
Is rill open source?
goerch · about 1 hour ago
esafak · about 3 hours ago
Why did you pick rill?
goerch · about 2 hours ago
It is an educational/R&D-type project. We are mostly backend developers, and `rill` worked fine as a rapid visualization frontend with a low learning curve for us.

Edit: still realizing that I can't use markdown on HN...

meetingthrower · about 1 hour ago
I got introduced to it by Claude the other day as I was interrogating several GB of public CSV files. It seemed magical as it put them all in Parquet files and transformed what I needed into normalized SQLite for my server. Coding agents seem quite comfortable with it!
whalesalad · about 1 hour ago
The Claude + DuckDB combo is legendary for doing quick analysis of huge datasets. Every time I need to analyze a big CSV (200 MB+) or, as you noted, a Parquet file or really anything columnar, I'll tell Claude, 'you have duckdb at your disposal for this,' and within minutes it's all sorted (no pun intended).
steve_adams_86 · about 3 hours ago
I use it almost daily. Any time I benchmark changes or analyze logs, I collect the data I need as CSV and analyze it with DuckDB. The flexibility and ease mean I find so much more interesting information. It's indispensable to me now.
esafak · about 4 hours ago
Any opinions on DuckLake?
erikcw · 29 minutes ago
I had a very good experience with it last year. I used it at large scale with data that had previously been in Iceberg, and it worked flawlessly. It's only improved since. Highly recommend.
denom · about 3 hours ago
Seems stable enough; they've patched a bunch of things.