Senior Data Engineer for hire.
I help companies build reliable, scalable data infrastructure — end-to-end pipelines, cloud data warehouses, DataOps, and the engineering foundations that make your data actually trustworthy.
6+
Years of experience
4
Cloud platforms
40%
Avg. pipeline speedup
PT/Remote
Based in Lisbon
// what I do
Services
I work with companies that need senior data engineering expertise — whether that's building something from scratch, fixing what's broken, or scaling what exists.
Data Platform Architecture
Design and build cloud-native data platforms from scratch — ingestion, transformation, storage, and serving layers — that scale with your business.
Data Warehouse Design
Dimensional modeling, data vault, and star/snowflake schemas that power your BI and analytics. Built for performance, clarity, and maintainability.
ELT/ETL Pipeline Engineering
End-to-end pipeline development — from raw source ingestion to curated, analytics-ready datasets. Orchestrated, tested, monitored, and SLA-driven.
DataOps & CI/CD for Data
Bring software engineering best practices to your data workflows — automated testing, data quality checks, CI/CD pipelines, and monitoring & alerting.
Infrastructure as Code
Reproducible, version-controlled cloud infrastructure for your data platform. No more snowflake environments — every deployment is consistent and auditable.
Fractional Data Lead
Embedded senior leadership for small teams — architecture decisions, code review, hiring guidance, and DataOps culture — without the cost of a full-time hire.
// proof of work
Results
Measurable impact from real engagements — not estimates.
40%
Pipeline runtime reduction
Redesigned cloud data pipelines at Carpe Data
30%
Processing performance gain
Scalable pipeline architecture improvements
25%
Data accuracy improvement
ETL pipeline quality overhaul
30%
Data load time reduction
Query optimisation and batch tuning
What I've delivered
- Built end-to-end ELT platforms processing unstructured invoice data into analytics-ready datasets
- Designed dimensional data models powering BI dashboards and product analytics across multiple teams
- Standardised infrastructure with Terraform enabling one-click, reproducible environment deployments
- Shipped CI/CD for data workflows, cutting deployment risk and accelerating release cycles
- Supported enterprise customers at Microsoft running mission-critical workloads on Databricks & Azure
// tools of the trade
Tech Stack
The platforms and tools I use day-to-day to build production data systems.
Cloud & Warehouses
Orchestration & Pipelines
Languages
DataOps & Infrastructure
Modeling & Analytics
// career history
Experience
- Designed and deployed end-to-end ELT pipelines transforming unstructured invoice data into curated analytics datasets
- Built automated ingestion, validation, and enrichment workflows with orchestration, testing, and monitoring
- Implemented CI/CD for data pipelines, improving release frequency and reliability
- Architected scalable cloud data pipelines — 30% performance gain, 40% runtime reduction
- Designed dimensional data models powering BI dashboards and product analytics
- Standardised infrastructure with Terraform for reproducible, automated deployments
- Implemented CI/CD for data workflows enabling automated testing and faster releases
- Mentored data engineers and led DataOps and agile delivery best practices
- Built a cloud-based data warehouse and analytical layer for BI and self-service analytics
- Developed optimised ETL pipelines — 25% accuracy improvement, 30% load time reduction
- Implemented data quality validation and performance tuning for large-scale datasets
- Supported enterprise customers running production platforms on Databricks, Cosmos DB, and Azure Data Explorer
- Resolved performance, scalability, and reliability issues for mission-critical data workloads
- Advised on architecture, data partitioning strategies, and query optimisation
- Implemented distributed data processing pipelines using Spark and Azure Data Factory
// who I am
About
I'm a Senior Data Engineer with 6+ years of experience helping companies turn raw data into reliable, business-ready assets. I specialise in building the infrastructure that makes data trustworthy — pipelines, warehouses, quality frameworks, and the DataOps practices that keep everything running.
My background spans cloud data platforms at scale (Microsoft enterprise support), fast-paced startup environments, and consulting — so I'm comfortable both in the weeds of a broken pipeline and in the room where architecture decisions are made.
I hold an MSc in Data Science & Advanced Analytics from NOVA IMS, Lisbon, and a BSc in Business Management from the University of Turin — which means I understand both the engineering and the business context behind data problems.
Education
MSc — Data Science & Advanced Analytics
NOVA IMS · Lisbon
BSc — Business Management
University of Turin · Turin
// let's work together
Book a free call
Got a data challenge you're not sure how to solve? Building something new and need a senior engineer? Let's talk — 30 minutes, no commitment, just a conversation about what you're trying to build.
Prefer email? piero.maggi3@gmail.com
Fast response
I reply within 24 hours
No fluff
Just a straight conversation about your problem
Remote-ready
EU timezone, async-friendly