I'm a Data Engineer passionate about using data engineering and BI to unlock hidden insights. I love transforming raw data, the world's most valuable resource, into actionable intelligence that drives real business impact! I have 3+ years of experience at a data consulting firm, and my expertise lies in batch processing using Microsoft Azure tools and SQL. Within Azure, I perform ETL/ELT processes using Databricks, Synapse, and Data Factory; when working with Spark in Databricks and Synapse, I use both the Python and SQL APIs. I also have a background in BI, building Power BI reports, performing data modeling, and creating metrics with DAX. Looking forward to connecting with passionate data folks and exploring new opportunities!
Data Engineer | Aubay | Lisbon, Portugal
Jr. Data Engineer | Kumulus | May 2024 - Present | Campinas, São Paulo, Brazil
- Working on a project for a Brazilian state judicial government agency. The goal is to create a database centralizing information about individuals named in lawsuits, such as the vehicles and companies they own, their debts, their tax information, and whether they work for the government. We cross nine governmental databases and perform the ELT using Python in Azure Synapse Analytics. My role is to build the ELT alongside another Data Engineer.
Data Engineering Analyst | Kumulus | Jun 2023 - May 2024 | Campinas, São Paulo, Brazil
- Worked on a project for a Brazilian state judicial government agency. The goal was to create a database centralizing information about individuals named in lawsuits, such as the vehicles and companies they own, their debts, their tax information, and whether they work for the government. We crossed nine governmental databases and performed the ELT using Python in Azure Synapse Analytics; my role was to build the ELT alongside another Data Engineer.
- Worked on a project for one of the largest Cloud, Data & AI services companies in Latin America. The goal was to gather information from seven departments, centralize it in one Data Warehouse (Azure SQL Database), and use it to feed Power BI reports. I built the ELT using Azure Data Factory and Azure Databricks (both the Python and SQL APIs) and was responsible for the ELT process, the Data Warehouse, and the data modeling. As a result, reports that used to take 15 days to a month to compile manually are now updated daily for the board of directors.
- Worked on the support team, providing support for BI and Data Engineering projects developed by Kumulus, fixing bugs and making improvements. Our team was responsible for six clients: a pharmaceutical company, a law firm, a mobile operator, a software company, an automotive parts and services store, and an IT consulting company. Technologies used: Azure Data Factory, Python in Azure Databricks, Azure SQL Server, Azure Analysis Services, Power BI, DAX, SSIS, and SSRS.
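As an illustrative stand-in for the "cross nine governmental databases" step (plain Python, not the actual Synapse job; the source names and fields below are hypothetical), the consolidation amounts to a left join of per-source tables on a shared person identifier:

```python
# Hypothetical sketch: consolidate one record per person from several
# source tables, each modeled here as a dict keyed by person id.

def cross_sources(person_ids, sources):
    """Build one consolidated record per person with left-join semantics:
    a person missing from a source simply contributes no fields."""
    consolidated = []
    for pid in person_ids:
        record = {"person_id": pid}
        for source_name, table in sources.items():
            # Prefix each field with its source to avoid name clashes.
            record.update({f"{source_name}_{k}": v
                           for k, v in table.get(pid, {}).items()})
        consolidated.append(record)
    return consolidated

sources = {
    "vehicles": {1: {"count": 2}},
    "tax": {1: {"status": "regular"}, 2: {"status": "pending"}},
}
rows = cross_sources([1, 2], sources)
```

In the real job the same shape would be expressed as Spark DataFrame joins, but the left-join semantics (keep the person, tolerate missing sources) is the essential design choice.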
Data Analyst | Kumulus | May 2022 - Jun 2023 | Campinas, São Paulo, Brazil
- Worked on the support team, providing support for BI and Data Engineering projects developed by Kumulus, fixing bugs and making improvements. Our team was responsible for six clients: a pharmaceutical company, a law firm, a mobile operator, a software company, an automotive parts and services store, and an IT consulting company. Technologies used: Azure Data Factory, Python in Azure Databricks, Azure SQL Server, Azure Analysis Services, Power BI, DAX, SSIS, and SSRS. Once, a client detected bugs in five SSRS reports that no one had been able to solve for two months; when I was assigned to the issue, it was solved in two weeks.
- Developed a project for one of the biggest pharmaceutical companies in Brazil. The goal was to compile all sales information into one Power BI report. To achieve that, I created an ETL process in Azure Data Factory that extracted data from Oracle JDE databases and Excel files, treated it, and stored it in a Data Warehouse (Azure SQL Database). The data was then consumed by Azure Analysis Services, which fed the Power BI report. I was responsible for the end-to-end development of the solution, including the ETL, the reports, and the DAX metrics. The project was a tremendous time saver for the company, since the report had previously been compiled manually and emailed daily by an employee; furthermore, the firm now accesses hourly-updated information instead of daily.
- Developed another project for the same pharmaceutical company: a process that scales Azure Analysis Services servers and Azure SQL Databases via Data Factory. With this process, the client can change the tier of a resource during a pipeline execution for a programmed amount of time. The solution saves the department approximately R$ 15,000 (about US$ 3,000) per month.
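A minimal sketch of what the tier-scaling step could look like, assuming the Data Factory pipeline uses a Web activity against the Azure management REST API (the subscription, resource names, API version, and target tier below are placeholders, not values from the project):

```python
# Hypothetical sketch: build the PATCH request a Data Factory Web activity
# could issue to change an Azure SQL Database service tier.

API_VERSION = "2021-11-01"  # assumed management API version

def build_scale_request(subscription_id, resource_group, server, database, sku_name):
    """Return the management URL and JSON body that request a new
    service objective (tier) for the given database."""
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.Sql"
        f"/servers/{server}/databases/{database}"
        f"?api-version={API_VERSION}"
    )
    body = {"sku": {"name": sku_name}}
    return url, body

# Example: scale a database up to the (hypothetical) S3 tier before a
# heavy pipeline run; a second call would scale it back down afterwards.
url, body = build_scale_request("sub-id", "rg-bi", "sql-srv", "dw-db", "S3")
```

The "programmed amount of time" part would then be a wait step between the scale-up and scale-down calls in the pipeline itself.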
Data Analyst | Saleslogics | Jan 2022 - May 2022 | Campinas, São Paulo, Brazil
- Worked on the development of a Self-Service BI platform offering customers an overview of key business metrics, such as sell-in, sell-out, sales quotas, and sales incentive campaigns. I was assigned to improve the sales incentive campaign dashboards for one of the biggest pharmaceutical companies in Brazil; the improvements included changes to Data Factory pipelines, SQL objects (tables, views, procedures), Azure Analysis Services models, Power BI dashboards, and DAX metrics.
- Took a 64-hour online course offered by Data Science Academy on "Big Data Real-Time Analytics with Python and Spark".
Data Engineering Intern | Kumulus | Jan 2021 - Jan 2022 | Campinas, São Paulo, Brazil
- Went through a two-month training on SQL, ETL, and Data Modeling.
- Helped on a project for one of the biggest and most respected law firms in Latin America. The project consisted of extracting accounting data from SAP, TOTVS, and Excel files into a Data Warehouse (SQL database). The ETL ran in SSIS inside a Virtual Machine; the data was then consumed by an Azure Analysis Services server, which fed Power BI reports showing the firm's profit-and-loss account. My role was to help maintain the SSIS pipelines, the DW, and the Analysis Services server, making improvements and fixing bugs. My greatest contribution was fixing a major flaw where, if one of the expected Excel files was missing from the folder, the whole process failed. I created an SSIS step that verified the existence of each file and, if a file was absent, simply moved on with past data. The step also moved the files it had read to a historic directory, with subfolders split by execution date.
- Helped on a project for a big Brazilian automotive parts and services retailer with over 120 stores across 14 states. The client had dozens of Power BI reports importing data directly from Oracle databases. Kumulus created an Azure Analysis Services model, so my job was to repoint these reports to the AAS server and recreate the configurations that broke when the source changed (metrics, bookmarks, etc.).
- Earned the Azure DP-900 certification, which certifies knowledge of core data concepts and related Microsoft Azure data services.
- Took a 54-hour online course offered by Udemy on "Python Fundamentals for Data Analysis".
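The skip-missing-files-and-archive pattern from the law firm project can be sketched in a few lines of Python (an illustrative equivalent of the SSIS step, not the actual implementation; file names and folder layout are assumptions):

```python
import shutil
from datetime import date
from pathlib import Path

def load_available_files(inbox: Path, expected: list[str], archive: Path) -> list[str]:
    """Process only the expected files that actually exist, then move each
    processed file into a date-stamped archive subfolder. Missing files
    are skipped so the run continues with past data instead of failing."""
    day_folder = archive / date.today().isoformat()
    day_folder.mkdir(parents=True, exist_ok=True)
    processed = []
    for name in expected:
        src = inbox / name
        if not src.exists():
            continue  # tolerate the missing file; keep the pipeline alive
        # ... load src into the staging area here ...
        shutil.move(str(src), str(day_folder / name))
        processed.append(name)
    return processed
```

The key design choice is the same as in the original fix: a missing input degrades the run gracefully rather than aborting it, and every consumed file leaves an auditable trail in the dated archive.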
Scientific Initiation Student | Device Research Laboratory, Unicamp | Apr 2018 - Sep 2019 | Campinas, São Paulo, Brazil
- Worked on a project titled "Development and Fabrication of V-Grooves on Silicon for Auto-Alignment of Optical Fibers in Photonic Devices". The project developed a practical way to align optical fibers to photonic devices, using chemical processes to fabricate grooves on silicon wafers.
Davi Carvalho Skills
Davi Carvalho Education Details
- Database Technology
- Engineering Physics
Frequently Asked Questions about Davi Carvalho
What company does Davi Carvalho work for?
Davi Carvalho works for Aubay.
What is Davi Carvalho's role at the current company?
Davi Carvalho's current role is Data Engineer.
What is Davi Carvalho's email address?
Davi Carvalho's email address is da****@****.com.br
What schools did Davi Carvalho attend?
Davi Carvalho attended Impacta Tecnologia, Universidade Estadual De Campinas.
What skills is Davi Carvalho known for?
Davi Carvalho has skills like Python, Microsoft Excel, Machine Learning, Statistics, and MySQL.
Who are Davi Carvalho's colleagues?
Davi Carvalho's colleagues are Daniele Barraco, Giovanni Agasso, Attilio Palmieri, Yohann Jouanno, Vanessa Stchogoleff🍓, Inès Vahé, Anna Lisa Caraffa.