
Turn CSV, XML & JSON into a live analysis-ready SQL database

 1 year ago
source link: https://www.producthunt.com/posts/tablum-io

Support is great. Feedback is even better.

"We believe that TABLUM.IO can save you hours of routine work spent on data parsing, cleansing, and ingestion into an SQL database. TABLUM.IO will turn any raw data into analysis-ready SQL tables in seconds.

We value your feedback and would like to hear your thoughts on our product."

The makers of TABLUM.IO

Hey, Product Hunt! 👋

I'm Greg, the CEO and founder of TABLUM.IO. Today is a really special day for my team and me because we're finally launching our new product on PH!

TABLUM.IO is an online tool that can turn any data file, like CSV, Excel, JSON, or XML, into analysis-ready SQL tables with just a few clicks.

It's super flexible and can fetch data from an API (RESTful, SOAP), load data over HTTP, and parse, cleanse, and normalize data regardless of its original source. TABLUM.IO recognizes the content of columns, generates the corresponding SQL schema automatically, and loads the results into an internal relational database powered by ClickHouse. Ultimately, you get an analysis-ready SQL database out of raw and "dirty" data.
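To illustrate the kind of column recognition and automatic schema generation described above, here is a minimal, hypothetical sketch: it samples each CSV column, picks the narrowest SQL type that fits every value, and creates the table. This is not TABLUM.IO's actual implementation, and sqlite3 stands in for ClickHouse; all names are made up.

```python
import csv
import io
import sqlite3

def infer_sql_type(values):
    """Return the narrowest SQL type that holds every sampled value."""
    def fits(cast):
        try:
            for v in values:
                cast(v)
            return True
        except ValueError:
            return False
    if fits(int):
        return "INTEGER"
    if fits(float):
        return "REAL"
    return "TEXT"

def ingest_csv(conn, table, raw_csv):
    """Infer a schema from raw CSV text, create the table, and load the rows."""
    rows = list(csv.reader(io.StringIO(raw_csv)))
    header, body = rows[0], rows[1:]
    types = [infer_sql_type([r[i] for r in body]) for i in range(len(header))]
    cols = ", ".join(f'"{h}" {t}' for h, t in zip(header, types))
    conn.execute(f'CREATE TABLE "{table}" ({cols})')
    marks = ", ".join("?" for _ in header)
    conn.executemany(f'INSERT INTO "{table}" VALUES ({marks})', body)
    return dict(zip(header, types))

conn = sqlite3.connect(":memory:")
schema = ingest_csv(conn, "sales", "id,amount,city\n1,9.5,Berlin\n2,12,Paris\n")
print(schema)  # {'id': 'INTEGER', 'amount': 'REAL', 'city': 'TEXT'}
```

A real ingestion pipeline would also handle dates, nulls, quoting dialects, and malformed rows, which is exactly the routine work the product claims to automate.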

Once your data is loaded into TABLUM.IO, you can explore, visualize, or query it using any third-party data analytics and business intelligence software via a standard database connector. Or, you can explore and transform loaded datasets with a powerful SQL console within the TABLUM.IO UI.

Less time spent on data preparation means more time for data analysis.

Check out this quick demo on how easy it is to ingest CSV, JSON, and XML files into a live SQL database and explore them with Metabase:

We're really excited to receive feedback from you on our product, so please try it out for free at go.tablum.io and leave your comments or feedback below.

Thank you for your participation and support!

@gregz Congratulations on the launch!
@gregz it looks really interesting for data-intensive projects! Will try it in the next couple of days!
@new_user_28684f38402aa Thank you! Feel free to reach out to me with any questions. I'd be happy to demonstrate the product.
@gregz Congrats on the launch, and fingers crossed!
Great idea! So it is a hosted ClickHouse, right? What are the limitations? Are you able to update the records in the generated database? What if you upload the same CSV again — will it detect and resolve duplicates? Will it add new records to the same tables or generate new tables? Do you have any API for ingesting the data, or is it manual only?

@gleb_sologub It uses ClickHouse under the hood to store ingested data and shares the generated tables via the standard ClickHouse interface (the one on port 8443, over HTTP). The essential part of the solution is the way it ingests data from various sources: it takes raw, unstructured data and turns it into normalized, cleansed SQL tables regardless of its origin.
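As a hedged sketch of what "the standard ClickHouse interface" means in practice: ClickHouse's HTTP(S) interface accepts a query as the `query` URL parameter (plain HTTP is conventionally served on port 8123 and TLS on 8443). The host and table name below are placeholders, not TABLUM.IO specifics.

```python
from urllib.parse import urlencode

def clickhouse_query_url(host, sql, port=8443, scheme="https"):
    """Build a query URL for ClickHouse's HTTP(S) interface.

    ClickHouse accepts the SQL statement as the `query` URL parameter;
    a client would then issue a GET/POST request to this URL.
    """
    return f"{scheme}://{host}:{port}/?{urlencode({'query': sql})}"

# Hypothetical host and table, for illustration only.
url = clickhouse_query_url("example.tablum.io", "SELECT count() FROM my_dataset")
print(url)
```

Any BI tool with a ClickHouse connector (Metabase, for example, as shown in the demo above) speaks this same protocol, which is why the generated tables are immediately queryable.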

There’s an API for automation.

It is possible to update, replace, or cleanse already-loaded data. It also lets you download the data back in various formats.

Subqueries on the loaded data let you transform ingested datasets with aggregations, joins, duplicate removal, and many other manipulations.
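The transformations mentioned above can be sketched in plain SQL against an already-loaded table. Here sqlite3 stands in for the ClickHouse-backed console, and the table and column names are illustrative only, not TABLUM.IO's.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, city TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "Berlin", 9.5), (1, "Berlin", 9.5), (2, "Paris", 12.0)],  # note the duplicate
)

# Duplicate removal: keep one copy of each distinct row.
dedup = conn.execute(
    "SELECT DISTINCT id, city, amount FROM orders ORDER BY id"
).fetchall()
print(dedup)  # [(1, 'Berlin', 9.5), (2, 'Paris', 12.0)]

# Aggregation over a subquery: total amount per city on the deduplicated rows.
totals = conn.execute(
    "SELECT city, SUM(amount) FROM "
    "(SELECT DISTINCT id, city, amount FROM orders) "
    "GROUP BY city ORDER BY city"
).fetchall()
print(totals)  # [('Berlin', 9.5), ('Paris', 12.0)]
```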

I'm super interested how this can compare to Firebase or existing AWS/GCP/Azure db services. Good luck with the launch!
@anna__marie Thank you for your question. TABLUM.IO is a data ingestion service that prepares raw, unstructured data from files, feeds, and API responses and loads it into a relational database ready for analysis. It is not a database in terms of data storage, but a service that prepares data and places it into storage for further use in data analysis (building charts, generating dashboards, ad-hoc reporting). So it does not replace existing database services; it extends them with the ability to work with raw, unstructured data. TABLUM.IO can be part of a data pipeline workflow (it's closer to the ETL stack).

Firstly, congratulations on the launch of TABLUM.IO! 🎉 As someone who has spent countless hours on data parsing, cleansing, and ingestion, I'm excited to see a tool that aims to simplify these processes and save valuable time.

The idea of transforming raw data into analysis-ready SQL tables in seconds sounds incredibly promising. Your product could potentially revolutionize the way we handle data preparation and enable users to focus more on data analysis and insights generation.

I'm eager to give TABLUM.IO a try and see how it performs with different data formats and use cases. It would be interesting to learn more about the types of data sources TABLUM.IO supports, as well as the level of customization available for the data transformation process.

Once again, congratulations on your launch, and I'm looking forward to exploring TABLUM.IO further. I'll be sure to share my feedback and thoughts on the product as I dive in. Keep up the great work! 👍

@krzysztof_parjaszewski Thank you for your warm words. It would be amazing if you could test our solution and provide feedback.
