Data Engineer (Mid Level)
YoungInnovations Pvt. Ltd.


  • Industry: Other
  • Category: Programming/Software Development
  • Location: Kathmandu, Nepal
  • Expiry date: May 25, 2025 (Expired)
Job Description

We are looking for a talented Data Engineer to design and manage robust ETL pipelines for ingesting, transforming, and analyzing large-scale datasets. This role requires strong proficiency in Python, Apache Spark, and AWS data tools, as well as a solid understanding of DevOps practices. Familiarity with Parquet, DuckDB, and Snowflake, along with experience in US healthcare data, will be considered valuable additions.


Key Responsibilities:

  • Build and maintain scalable ETL pipelines using Python, Apache Spark (PySpark), and Pandas (see the pipeline sketch after this list).
  • Process and store data using Parquet and other performance-optimized formats.
  • Use AWS S3 and Apache Airflow for data ingestion, transformation, and job orchestration (see the orchestration sketch after this list).
  • Integrate DuckDB for lightweight analytical workloads and in-memory querying of Parquet data.
  • Apply DevOps principles to manage CI/CD pipelines, deployment, and monitoring of data workflows.
  • Identify and test for bugs and performance bottlenecks in the ETL solution.
  • Collaborate with engineering, analytics, and business teams to ensure reliable, high-quality data delivery.
  • Bonus: Integrate pipelines with Snowflake for data warehousing needs.
  • Bonus: Work with US healthcare datasets (e.g., CPT codes, MRFs) and understand compliance needs.
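
To give a concrete sense of the pipeline work described in the first few responsibilities, below is a minimal sketch of a single ETL step: PySpark ingests a raw CSV, writes the cleaned result as Parquet, and DuckDB runs a lightweight analytical query over the output. The claims dataset, file paths, and column names are illustrative assumptions, not details from this posting.

```python
# Minimal ETL sketch: PySpark -> Parquet -> DuckDB.
# All paths, the "claims" dataset, and the "amount" column are hypothetical.
import duckdb
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-etl-sketch").getOrCreate()

# Ingest a raw CSV and apply a simple cleaning transformation.
raw = spark.read.option("header", True).csv("/data/raw/claims.csv")
cleaned = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
)

# Persist the result in a performance-optimized columnar format.
cleaned.write.mode("overwrite").parquet("/data/curated/claims/")

# Lightweight, in-memory analytics over the Parquet output with DuckDB.
con = duckdb.connect()  # in-memory database
summary = con.execute(
    "SELECT count(*) AS rows, avg(amount) AS avg_amount "
    "FROM read_parquet('/data/curated/claims/*.parquet')"
).fetchdf()
print(summary)
```

In production the same paths would typically point at S3 (with hadoop-aws configured for Spark and the httpfs extension enabled in DuckDB), and the query results would feed downstream reporting rather than a print statement.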
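
The orchestration responsibility could be sketched in Apache Airflow roughly as follows, assuming Airflow 2.4+ with the TaskFlow API; the DAG id, schedule, and task bodies are placeholders rather than anything specified in this posting.

```python
# Orchestration sketch using the Airflow TaskFlow API (Airflow 2.4+).
# DAG id, schedule, and task contents are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def claims_etl():
    @task
    def extract() -> str:
        # e.g. pull raw files from S3 into a staging area
        return "/data/raw/claims.csv"

    @task
    def transform(raw_path: str) -> str:
        # e.g. submit the PySpark job that writes curated Parquet
        return "/data/curated/claims/"

    @task
    def load(curated_path: str) -> None:
        # e.g. publish the Parquet output or copy it into Snowflake
        pass

    load(transform(extract()))


claims_etl()
```

In a real deployment, the transform task would typically hand off to Spark through an operator such as SparkSubmitOperator or an EMR step, and the DAG file would be deployed through the same CI/CD pipeline as the rest of the workflow code.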


Required Qualifications:

  • 3+ years of experience in data engineering or a related role.
  • Strong experience with Python, Pandas, and Apache Spark.
  • Proficiency in working with Parquet files and large-scale data transformation.
  • Practical experience with DuckDB for lightweight and fast analytics on local or cloud data.
  • Proficiency in SQL and other database query languages.
  • Hands-on experience with AWS services: S3, EMR, Glue, etc.
  • Knowledge of DevOps tools and practices, including Git, CI/CD pipelines, and infrastructure automation.
  • Experience with workflow orchestration tools like Apache Airflow.
  • Strong analytical skills and experience working with large datasets.
  • Proficiency in Git for version control and collaborative development workflows.


Why Join Us?

  • 5-day work week (2 full days of relaxation)
  • Team building through engaging activities and social events
  • Accidental and medical insurance coverage
  • Opportunities for continuous learning and professional development
  • Great work environment
  • Company-provided hot, balanced and nutritious lunch
  • Festive allowance


You can submit your job application via email to careers@yipl.com.np, or apply through our careers page at https://younginnovations.com.np/career#hiring.


