Aspiring Data Engineer | Building Scalable Data Pipelines & Architectures
I am passionate about the "under the hood" work of data: designing, building, and optimizing systems that make data accessible and reliable. Currently, I am focused on mastering the tools that power modern data infrastructure.
Technical Toolkit:
Languages: Python (advanced scripting), SQL (complex queries and schema design).
Data Engineering Core: working knowledge of ETL/ELT processes, data warehousing concepts, and data integration.
Tools & Technologies: databases (PostgreSQL/NoSQL); exploring cloud fundamentals (AWS/Azure/GCP) and Big Data frameworks.
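The toolkit above can be illustrated with a minimal ETL sketch. This is an assumption-laden toy, not production code: SQLite stands in for PostgreSQL, and the `users` table, field names, and sample rows are all hypothetical.

```python
import sqlite3

def extract(rows):
    """Extract: yield raw records (here, an in-memory list of dicts
    standing in for a real source such as a CSV file or an API)."""
    for row in rows:
        yield row

def transform(records):
    """Transform: drop incomplete rows and normalize casing."""
    for rec in records:
        if rec.get("email"):  # skip rows missing a required field
            yield {
                "name": rec["name"].strip().title(),
                "email": rec["email"].strip().lower(),
            }

def load(records, conn):
    """Load: write cleaned rows into the target table (SQLite here,
    as a stand-in for PostgreSQL)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (name TEXT, email TEXT UNIQUE)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO users (name, email) VALUES (:name, :email)",
        records,
    )
    conn.commit()

# Hypothetical messy input: mixed casing, stray whitespace, a missing email.
raw = [
    {"name": "  ada lovelace ", "email": "ADA@example.com"},
    {"name": "bad row", "email": ""},  # dropped by the transform step
]
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
```

Chaining generators keeps each stage streaming, so the pipeline never materializes the full dataset in memory, which is the same shape a larger framework would impose.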
What I’m working on:
I enjoy solving the puzzles of data architecture—from cleaning messy datasets to automating workflows that ensure data quality and integrity. My goal is to bridge the gap between raw data and impactful insights by building robust foundations.
I am actively seeking internships or junior roles where I can apply my technical skills to real-world data challenges and contribute to building efficient data ecosystems.
🚀 Open to collaborations and networking!
Keywords: Data Engineering, ETL, Python, SQL, Data Pipelines, Big Data, Database Management.