ETL

Let's explore my journey through data

I've built this interactive experience to show you how I think about data. Walk through the ETL process with me: extract the raw details, transform them into something meaningful, and see the complete picture.

Click the steps in your preferred order to build your pipeline, or
Skip to Profile

Extract
Transform
Load

Your raw data awaits processing. Inspect the JSON or proceed to Transform.

// Source: LinkedIn, Resume, GitHub
// Status: Extracting raw data...
// Fun fact: I once optimized a query late at night that cut runtime from 4 hours to 60 seconds

{
  "name": "Jason Ungheanu",
  "title": "Sr. Data Engineer",
  "location": "Orange County, CA",
  "years_experience": 8,
  "education": [
    {
      "degree": "MS Applied Data Science",
      "school": "USC",
      "status": "In Progress (2024-2026)"
    },
    {
      "degree": "BA Economics, Statistics",
      "school": "CSU Fullerton"
    }
  ],
  "certifications": [
    "Deep Learning Nanodegree (Udacity, 2021)",
    "Data Engineering Nanodegree (Udacity, 2019)"
  ],
  "skills_raw": [
    "SQL", "Python", "BigQuery", "Postgres",
    "Azure SQL", "Looker", "dbt", "ETL",
    "Fivetran", "GCP Functions", "Airbyte", "dlt",
    "Django", "Airflow", "PowerShell", "Power BI",
    "DevExpress", "Docker", "REST APIs", "SSRS"
  ],
  "experience_raw": [
    { "role": "Sr. Data Engineer", "company": "Truist", "years": "2023-Present" },
    { "role": "Freelance", "company": "Remote", "years": "2020-2023" },
    { "role": "Data Engineer", "company": "BibleProject", "years": "2021-2023" },
    { "role": "BI Developer", "company": "BankDirect", "years": "2017-2021" }
  ],
  "projects_count": 6,
  "platforms_mastered": 5
}
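As a quick illustration, the Extract step above could be sketched in a few lines of Python. The inline payload is a trimmed stand-in for the full profile JSON; in a real pipeline it would come from an API response or a file.

```python
import json

# Minimal sketch of the Extract step: parse raw profile data
# from a JSON source. The payload here is a trimmed stand-in.
raw = """
{
  "name": "Jason Ungheanu",
  "title": "Sr. Data Engineer",
  "skills_raw": ["SQL", "Python", "BigQuery"]
}
"""

profile = json.loads(raw)
print(profile["name"])             # Jason Ungheanu
print(len(profile["skills_raw"]))  # 3
```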

Click each transformation below to process the data. Apply all three to continue.


Input Data

Skills (Raw)

"SQL", "Python", "BigQuery", "ETL", "Data Modeling"...

Experience (Unstructured)

{role, company, years} x 4 records

Education (Nested)

Array of degree objects


Clean

Aggregate

Enrich

Output Data

Skills (Categorized)

Waiting for transformation...

Experience (Timeline)

Waiting for transformation...

Education (Formatted)

Waiting for transformation...
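For a flavor of what the Clean, Aggregate, and Enrich steps might look like in Python — the skill list and category mapping below are purely illustrative:

```python
# Sketch of the Transform step on a raw skills list.
raw_skills = ["SQL ", "python", "BigQuery", "sql", "dbt"]

# Clean: strip whitespace, normalize case, dedupe
cleaned = sorted({s.strip().lower() for s in raw_skills})

# Aggregate: count skills per (illustrative) category
categories = {"sql": "Databases", "bigquery": "Databases",
              "python": "Languages", "dbt": "Tooling"}
counts = {}
for skill in cleaned:
    cat = categories.get(skill, "Other")
    counts[cat] = counts.get(cat, 0) + 1

# Enrich: attach a category to each skill record
enriched = [{"skill": s, "category": categories.get(s, "Other")}
            for s in cleaned]

print(cleaned)  # ['bigquery', 'dbt', 'python', 'sql']
print(counts)   # {'Databases': 2, 'Tooling': 1, 'Languages': 1}
```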

Profile

Jason Ungheanu

Senior Data Engineer

Results-driven Data Engineer with a strong background in data architecture, ETL pipelines, and analytics. Passionate about building scalable data systems and applying AI/ML techniques to extract insights from complex datasets. Currently pursuing a Master's in Applied Data Science with a focus on advanced data engineering, cloud architecture, and applied machine learning research.

Stats (click to explore)

8+

Years Exp

6+

Projects

5+

Platforms

Skills

SQL Python BigQuery Postgres Azure SQL Looker dbt ETL Fivetran GCP Functions Airbyte dlt Django Airflow PowerShell Power BI DevExpress Docker REST APIs SSRS

Experience

Sr. Data Engineer

Truist (acquired BankDirect)
2023 - Present

Data Engineer

BibleProject
2021 - 2023

Freelance Data Engineer

Remote
2020 - 2023

BI Developer

BankDirect Capital Finance
2017 - 2021

Education

MS Applied Data Science

University of Southern California
2024 - 2026 (In Progress)

BA Economics, Statistics

CSU Fullerton
2009 - 2013

Certifications

Deep Learning Nanodegree

Udacity
2021

Data Engineering Nanodegree

Udacity
2019

Contact

Orange County, CA

Jasonungheanu@gmail.com

(714) 869-4602

Download Resume

Research

EOE Atlas

AI-Powered Medical Research Platform

An agentic AI platform for exploring Eosinophilic Esophagitis research. Integrates 7 specialized AI agents to analyze PubMed-indexed literature, synthesize clinical findings, and surface emerging treatment patterns through an interactive Next.js interface.

Agentic AI RAG Pipeline PubMed Indexed Next.js + TypeScript
View Live

Pipeline Executed Successfully!

Data extracted, transformed, and loaded in record time.

🔄

Ah, an ELT Enthusiast!

I see you prefer to Load before Transform. Modern data warehouses like BigQuery and Snowflake have made this increasingly popular!

💡 Fun Fact: ELT became mainstream with cloud data warehouses because they can handle transformations at scale. The "T" moved to the end when storage became cheaper than compute!
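A minimal ELT sketch in Python, with an in-memory SQLite database standing in for a cloud warehouse (the table and column names are illustrative):

```python
import sqlite3

# ELT sketch: load raw records first, transform afterwards
# inside the "warehouse" (here, in-memory SQLite).
con = sqlite3.connect(":memory:")

# Load: land the raw records exactly as extracted
con.execute("CREATE TABLE raw_skills (name TEXT)")
con.executemany("INSERT INTO raw_skills VALUES (?)",
                [(" SQL ",), ("python",), ("SQL",)])

# Transform: the "T" happens after loading, in plain SQL
con.execute("""
    CREATE TABLE skills AS
    SELECT DISTINCT LOWER(TRIM(name)) AS name FROM raw_skills
""")
rows = [r[0] for r in con.execute("SELECT name FROM skills ORDER BY name")]
print(rows)  # ['python', 'sql']
```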

Great choice! Let's walk through the classic ETL flow.

Projects

Web Crawling...
0 elements collected

Crawl Complete!

The web crawler has finished collecting data from this page.

Elements Crawled: 0
Time Elapsed: 0s
Crawl Rate: 0 el/s
Additional Data Points:
is_twin: true
mined_crypto: true
current_fuel: "Verve Sermon roast"