Skilled technology professional with experience in Data Engineering, Backend Development, Java, Python, and SQL. Proficient in building data pipelines, databases, and backend systems for analytics and scalable applications. Experienced in ETL processes, SQL queries, RESTful APIs, and real-time data processing. Strong understanding of data structures, algorithms, and software architecture. Passionate about creating efficient, reliable, and impactful software and data solutions.
Theni, Tamilnadu, India
ethamuvelitbnec@gmail.com
+91 9384207038
About Me
» Proven track record in project management, process improvement and complex problem-solving, with a strong focus on quality, safety and innovation. Adept at collaborating with cross-functional teams and utilizing excellent communication skills to optimize project outcomes.
» Self-motivated, organized, and efficient data engineer and analyst with relevant technology experience and data processing skills. Data-driven and analytical innovator with excellent problem-solving skills, business acumen, and passion for contributing to large-scale data ingestion and research initiatives.
Education
Cultus Education
Currently pursuing a course on Amazon Web Services (AWS) covering core cloud concepts, EC2, S3, RDS, and IAM. Gaining hands-on experience in deploying applications, managing cloud storage, and setting up basic networking and security. Building a strong foundation in cloud computing, scalability, and cost-optimization principles.
Samcore Solution
Strong understanding of Java basics: OOP concepts, classes, inheritance, and exception handling. Used Java for writing clean, modular, and efficient code. Familiar with data structures and algorithms in Java. Basic experience with file handling, multithreading, and JDBC.
Samcore Solution
Proficient in Python programming with a focus on data manipulation, automation, and processing large datasets. Worked with basic concepts like functions, loops, OOP, and file handling. Used libraries like Pandas and NumPy for data cleaning and analysis. Familiar with writing Python scripts for handling large datasets.
Bharath Niketan Engineering College, Anna University
Studied core subjects like data structures, databases, and distributed systems. Gained knowledge in data handling, storage, and basic big data concepts. Completed projects using Python, SQL, and data analysis tools. Built a strong base in programming (Java, Python) useful for data engineering roles.
Experience
Samcore Solution, Trichy, Tamilnadu
Strong understanding of Java basics such as OOP concepts, classes, inheritance, and exception handling. Used Java for writing clean, modular, and efficient code. Familiar with data structures and algorithms in Java. Basic experience with file handling, multithreading, and JDBC. Able to use Java for backend logic and data-related tasks.
Samcore Solution, Trichy, Tamilnadu
Completed a Python internship focused on data handling and automation tasks. Worked on real-time projects involving data cleaning, analysis, and basic reporting. Used Python libraries like Pandas, NumPy, and Matplotlib for processing and visualizing data. Gained hands-on experience in writing Python scripts for file operations, data conversion, and simple ETL processes. Improved problem-solving and debugging skills through daily coding tasks and reviews.
Skills
Python
Java
Databases
Scala
Linux
Apache Spark
Apache Airflow
AWS Services
Kafka
Snowflake
Azure
GCP
My Services
End-to-End Data Pipeline Development
I design and implement scalable ETL/ELT pipelines using tools like Apache Airflow, Spark, and Python, enabling seamless ingestion, transformation, and loading of structured and unstructured data.
Cloud Data Engineering Solutions
I architect and manage data workflows on AWS (S3, Redshift, Glue, Lambda) and Azure Data Factory, ensuring cost-efficient, secure, and reliable cloud-based data infrastructure.
Big Data Processing
I work with Apache Spark and Hadoop ecosystems to process and analyze large datasets, improving data availability and query performance for business intelligence.
Database Design & Optimization
I build and optimize relational (PostgreSQL, MySQL) and NoSQL (MongoDB) databases, applying indexing, partitioning, and query tuning techniques to ensure efficiency at scale.
Portfolio
Designed and implemented a real-time data pipeline using Apache Kafka. Ingested streaming data from external APIs (e.g., stock market/tweets). Processed and stored transformed data in PostgreSQL for analytics. Improved data availability for real-time monitoring and reporting.
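The consume-transform-load flow described above can be sketched as follows. This is a minimal illustration, not the project code: the Kafka consumer and the PostgreSQL writer are replaced by in-memory stand-ins, and the message fields (`symbol`, `price`, `timestamp`) are hypothetical examples of a stock-tick payload.

```python
import json

def transform(raw_message: bytes) -> dict:
    """Parse one raw streaming message and shape it for the analytics table."""
    record = json.loads(raw_message)
    return {
        "symbol": record["symbol"].upper(),   # normalize ticker symbols
        "price": float(record["price"]),      # enforce numeric type
        "ts": record["timestamp"],
    }

def run_pipeline(messages, sink):
    """Consume messages (stand-in for a Kafka consumer loop) and append
    transformed rows to `sink` (stand-in for a PostgreSQL insert)."""
    for msg in messages:
        sink.append(transform(msg))

rows = []
run_pipeline(
    [b'{"symbol": "aapl", "price": "187.5", "timestamp": "2024-01-02T10:00:00Z"}'],
    rows,
)
print(rows[0])  # transformed record ready for the analytics store
```

In the real pipeline, the loop body would poll a Kafka topic and the sink would execute parameterized `INSERT` statements against PostgreSQL; the transform step stays a pure function either way, which keeps it easy to test.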
Improved data retrieval speed by 25% by designing a star schema for structured business data. Built ETL workflows to move raw data from AWS S3 into Redshift. Optimized queries using distribution keys and sort keys. Enabled faster analytical reporting and BI integration.
Increased data processing efficiency by 30% using PySpark for analyzing 2 million e-commerce transactions. Performed data cleaning, transformations, and aggregations at scale. Stored results in HDFS/PostgreSQL for further reporting.
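The aggregation step can be sketched in plain Python; this is a stand-in for the PySpark job (roughly `df.groupBy("customer").agg(F.sum("amount"))`), with hypothetical field names and a three-row sample instead of the 2 million transactions the original job processed.

```python
from collections import defaultdict

# Hypothetical sample of e-commerce transactions.
transactions = [
    {"customer": "c1", "amount": 40.0},
    {"customer": "c1", "amount": 10.0},
    {"customer": "c2", "amount": 25.0},
]

def total_spend(records):
    """Sum transaction amounts per customer -- the pure-Python equivalent
    of a Spark groupBy/agg over the same records."""
    totals = defaultdict(float)
    for r in records:
        totals[r["customer"]] += r["amount"]
    return dict(totals)

print(total_spend(transactions))  # {'c1': 50.0, 'c2': 25.0}
```

At scale, Spark distributes exactly this shuffle-and-sum across executors; the logic per group is identical.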
Optimized data processing efficiency by 30% using Apache Airflow to develop a scheduled ETL pipeline. Automated daily ingestion of data from a REST API into PostgreSQL. Applied Python scripts for data cleaning and transformation. Eliminated manual effort through workflow automation.
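The extract-clean-load flow above can be sketched as three functions. In the original project each step was an Airflow task hitting a real REST API and PostgreSQL; here the source and sink are in-memory stand-ins (and the record fields are hypothetical) so the flow is runnable end to end.

```python
def extract():
    # Stand-in for an HTTP GET against the REST API.
    return [{"id": "1", "value": " 42 "}, {"id": "2", "value": ""}]

def clean(raw):
    # Drop records with empty values and normalize types,
    # as the Python transform task would.
    return [
        {"id": int(r["id"]), "value": int(r["value"].strip())}
        for r in raw
        if r["value"].strip()
    ]

def load(rows, table):
    # Stand-in for parameterized INSERTs into PostgreSQL.
    table.extend(rows)

table = []
load(clean(extract()), table)
print(table)  # [{'id': 1, 'value': 42}]
```

In Airflow, each function becomes a task and the `extract >> clean >> load` ordering is declared in the DAG; the scheduler then runs it daily with no manual intervention, which is the automation described above.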