
Mauricio Fraga de Andrade

Canoas

Summary

Data Engineer with over 10 years of experience in data pipelines, data warehousing, and analytics platforms. Proven ability to design and optimize ETL/ELT workflows, manage large-scale data processing, and implement modern data stack tools like DBT, Airflow, BigQuery, and Snowflake. Strong background in BI and performance tuning of data flows. Passionate about automation, data quality, and efficient data modeling using star and snowflake schemas.

Overview

26 years of professional experience
1 Certification

Work History

Data Engineer

Involves
06.2022 - Current
  • ETL Pipeline Standardization - Pentaho PDI

Built a robust ETL pipeline in Pentaho PDI, parameterized with database-driven variables.

Enabled multi-environment execution with a single pipeline, eliminating duplicated flows.

Added automated reprocessing and modular activation/deactivation via control tables.

Established a new development standard, improving agility, consistency, and maintenance efficiency.
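The Pentaho PDI pipeline itself is not shown here; as a minimal sketch of the database-driven parameterization idea (table and column names are hypothetical, with sqlite3 standing in for the real control database), one pipeline definition can resolve its variables per environment at runtime:

```python
import sqlite3

# Hypothetical control table: one row per (environment, variable) pair.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE etl_variables (
           environment TEXT, name TEXT, value TEXT, active INTEGER
       )"""
)
conn.executemany(
    "INSERT INTO etl_variables VALUES (?, ?, ?, ?)",
    [
        ("dev",  "target_schema",  "staging_dev", 1),
        ("prod", "target_schema",  "warehouse",   1),
        ("prod", "reprocess_days", "3",           1),
    ],
)

def load_variables(environment: str) -> dict:
    """Resolve active pipeline variables for one environment, so a single
    pipeline definition runs everywhere without duplicated flows."""
    rows = conn.execute(
        "SELECT name, value FROM etl_variables "
        "WHERE environment = ? AND active = 1",
        (environment,),
    )
    return dict(rows.fetchall())

print(load_variables("prod"))
```

Deactivating a row in the control table (setting `active` to 0) switches a step off without touching the pipeline, which is the modular activation/deactivation described above.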

  • Automated Data Validation - Python + Slack

Developed a Python script integrated with Slack for data validation.

Supported both record count validation (source vs. target) and anomaly detection (out-of-threshold values).

Allowed execution on schedule or on demand via Slack commands.

Persisted validation results for dashboard monitoring, reducing support tickets and boosting client trust in ETL outputs.
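The original script and its Slack integration are not included; the sketch below only illustrates the two checks described, with illustrative names and message format:

```python
def validate_counts(source_count: int, target_count: int) -> bool:
    """Record-count validation: source vs. target must match exactly."""
    return source_count == target_count

def detect_anomaly(value: float, lower: float, upper: float) -> bool:
    """Flag values that fall outside the expected threshold band."""
    return not (lower <= value <= upper)

def build_alert(table, source_count, target_count):
    """Return a Slack-style message when validation fails, else None."""
    if validate_counts(source_count, target_count):
        return None
    return (f":warning: {table}: source={source_count} "
            f"target={target_count} (diff={source_count - target_count})")

# prints ":warning: orders: source=1000 target=998 (diff=2)"
print(build_alert("orders", 1000, 998))
```

In the real workflow the returned message would be posted via the Slack API and the result row persisted for the monitoring dashboard.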

  • SellOut Data Integration - AWS S3 + Snowflake

Designed and implemented an ETL integration of client-provided SellOut data with platform data.

Automated file handling: bucket monitoring, control table logging, processing, validation, alerts, and cleanup.

Consolidated processed data into the client's existing data model and delivered daily invoice reports.

Leveraged Python, AWS S3, MySQL, Pentaho, and Snowflake to ensure reliability and automation.
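The production integration used S3, MySQL, Pentaho, and Snowflake; the stdlib-only sketch below shows just the control-table logging step, with sqlite3 standing in for the MySQL control table and a plain list standing in for the S3 bucket listing (all names are hypothetical):

```python
import sqlite3

# Stand-in control table tracking each SellOut file's lifecycle.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE file_control (file_name TEXT PRIMARY KEY, status TEXT)")

def register_new_files(bucket_listing: list[str]) -> list[str]:
    """Log newly seen files as 'pending'; skip files already registered,
    so repeated bucket scans stay idempotent."""
    new = []
    for name in bucket_listing:
        seen = db.execute(
            "SELECT 1 FROM file_control WHERE file_name = ?", (name,)
        ).fetchone()
        if seen is None:
            db.execute("INSERT INTO file_control VALUES (?, 'pending')", (name,))
            new.append(name)
    return new

def mark_processed(name: str) -> None:
    """Flip a file's status once processing and validation succeed."""
    db.execute(
        "UPDATE file_control SET status = 'processed' WHERE file_name = ?",
        (name,),
    )

# First scan registers both files; a rescan finds nothing new.
print(register_new_files(["sellout_0101.csv", "sellout_0102.csv"]))
print(register_new_files(["sellout_0101.csv"]))  # []
mark_processed("sellout_0101.csv")
```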

Business Intelligence Analyst

Syntonic Brasil
08.2018 - 03.2022
  • Data model for Campaign Cohort

Developed a data model to represent media campaign structures, enabling cohort and seasonality analysis.

Using this model, implemented a linear regression in Tableau to forecast break-even points for media campaigns, supporting more data-driven marketing strategies.
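The forecast above was built with Tableau's trend model; the same idea can be sketched in pure Python with closed-form ordinary least squares and made-up cohort numbers:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def break_even_day(days, cumulative_revenue, campaign_cost):
    """Extrapolate the fitted revenue trend to the day it covers the cost."""
    a, b = fit_line(days, cumulative_revenue)
    return (campaign_cost - a) / b  # solve a + b*x = cost for x

# Illustrative cohort: cumulative revenue over the first 5 days of a campaign.
days = [1, 2, 3, 4, 5]
revenue = [120, 250, 360, 500, 610]
print(round(break_even_day(days, revenue, 2000), 1))  # 16.3
```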

  • Reports

Created a financial report integrating multiple data sources (Excel, databases, and internal systems) to automate the full chain of tax and levy calculations, generating a profit/loss view by service line.

This report provided the Finance team with greater agility - allowing them to focus on results instead of manual data aggregation - and gave management better visibility into service profitability.

  • Main technologies used: Pentaho PDI, Tableau, MySQL, Bitbucket, Scrum, SQL

Business Intelligence Analyst

ZENVIA
03.2013 - 08.2018
  • Data Model

Designed and implemented a Star Schema data model for a client in the insurance sector.

The model was fed daily via full-dimension files and incremental fact files, and also served as the underlying data source for Tableau dashboards, improving accessibility and reporting consistency.

  • Governance/Validation

Developed ETL routines for automated data validation, integrated as mandatory steps in the daily orchestration process.

These validations increased trust in the generated data and reduced the number of support tickets that clients opened.

  • Media Campaigns

Created a data model to store user subscription history from media campaigns.

Before a new subscription was confirmed, the system validated whether the user had previously subscribed and applied predefined payment rules to decide acceptance.

This solution improved media campaign performance by 3.5%, optimizing acquisition costs and campaign ROI.
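The text does not specify the payment rules themselves, so the check below is a hypothetical illustration of the gating logic: look up the user's subscription history and decide acceptance before confirming:

```python
# Hypothetical history store: user_id -> list of past subscription statuses.
history = {
    "user_a": ["cancelled", "cancelled"],
    "user_b": ["active"],
}

def accept_subscription(user_id, max_prior_cancellations=2):
    """Apply predefined rules before confirming a new subscription:
    reject users who are already active or who churned too often."""
    past = history.get(user_id, [])
    if "active" in past:
        return False  # already subscribed
    if past.count("cancelled") >= max_prior_cancellations:
        return False  # repeated churn: likely unprofitable to reacquire
    return True

print(accept_subscription("new_user"))  # True
print(accept_subscription("user_b"))   # False: already active
```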

  • Main technologies used: Linux, MySQL, InfiniDB, PostgreSQL, AWS Redshift, Pentaho Suite (PDI/BI Server), Tableau Server, Tableau Desktop, Bitbucket, Scrum, SQL

Mid-Level Infrastructure Analyst

Pure Bros Mobile
02.2007 - 01.2013
  • Main Activities
  • Analyzed and identified problems in Java applications, helping to improve the production platform.
  • Analyzed data to identify improvements in internal and partner flows.
  • Created bash scripts to automate routines and database updates.
  • Managed and answered partner support tickets.
  • Main technologies used: Linux, Shell Script, Mysql, Ocomon

Support Analyst

Constat
03.2000 - 07.2006
  • Main Activities
  • Answered 1st- and 2nd-level support calls in a Windows environment.
  • Prepared documentation for corporate environment systems.
  • Performed network management routines.
  • Identified and solved network problems.
  • Created and kept ticket metrics up to date.
  • Main technologies used: Windows, Windows Server, Ticket System (Qualitor), Microsoft Office

Education

Systems Analysis and Software Development

Faculdade Senac De Tecnologia RS
Porto Alegre, RS
01.2021

Skills

  • SQL
  • Data Modeling
  • MySQL
  • Airflow
  • Data Build Tool (DBT)
  • Snowflake
  • Extract, Transform, Load (ETL)
  • Tableau Desktop
  • Data Flow
  • Alteryx

Certification

Python Essential Training, LinkedIn, 03.2021

