4 Technical Duh! Lessons I Learned from My Latest Data Science Project

By Medium - 2021-03-14

Description

I am all about writing technical articles ✍️ with fully working code. Seeing people benefit from the tutorials I design is very fulfilling. In addition to that, I started to think about writing…

Summary

  • Data Science real-life stories, with a touch of soft-skill revelations…
  • The file consists of 58 separate sheets and more than 150k US government contracts.
  • Seeing that Plotly Express lacks some basic design capabilities, I learned that I will always use the Graph Objects module when working on a real project, even though it means spending extra time and even though my Google searches keep surfacing all these attractive Plotly Express implementations.
  • The lesson I learned is that whenever I start a data visualization project, I need to find out whether the client may need a backend solution, so I can offer the highest value to their users.

 

Topics

  1. Backend (0.36)
  2. Database (0.23)
  3. Frontend (0.16)

Similar Articles

What was the data approach to breaking The FinCEN Files?

By diginomica - 2020-09-24

How do you work backwards from thousands of disconnected PDFs to the 100,000 suspicious financial transactions they’re reporting on? A combination of hard human work, OCR, data analysis and graph.

Data Science Learning Roadmap for 2021

By freeCodeCamp.org - 2021-01-12

Although nothing really changes but the date, a new year fills everyone with the hope of starting things afresh. If you add in a bit of planning, some well-envisioned goals, and a learning roadmap, yo ...

Datasets should behave like git repositories

By DAGsHub Blog - 2021-01-18

Create, maintain, and contribute to a long-living dataset that will update itself automatically across projects, using git and DVC as versioning systems.

15 Essential Steps To Build Reliable Data Pipelines

By Medium - 2020-12-01

If I learned anything from working as a data engineer, it is that practically any data pipeline fails at some point. Broken connection, broken dependencies, data arriving too late, or some external…