ETL/Data Engineer
- Industry: Other
- Category: IT & Telecommunication
- Location: Lalitpur District, Nepal
- Expiry date: Feb 13, 2026
Job Description
Job Title:
ETL/Data Engineer
Job Summary:
We are seeking a talented ETL/Data Engineer with strong experience in Google Cloud Platform (GCP), specifically BigQuery and Cloud Run, to maintain and enhance our client's ETL workflows. The ideal candidate has hands-on experience building and optimizing data pipelines on GCP, designing effective data models, and collaborating directly with stakeholders to understand and fulfill reporting requirements. This role requires a well-rounded engineer who can both handle the technical infrastructure work and collaborate on data model design that supports business needs. This is an opportunity to join the Maitri family early, with ample room for career growth, including potential engineering management responsibilities for the right candidate.
Job Description:
As a Data Engineer, your responsibilities will include:
- Maintaining and improving GCP-hosted Cloud Run ETL workflows that ingest data and load it into BigQuery
- Collaborating with client stakeholders to understand both prototype and production environments within GCP infrastructure
- Mapping and documenting existing ETL flows and data pipelines
- Working with business teams to understand different data types, sources, and their business context
- Partnering with reporting teams to understand dashboard requirements and near-term reporting needs
- Identifying and understanding key data tables in BigQuery that support business reporting and analytics
- Designing and implementing new BigQuery table structures while maintaining backward compatibility with legacy systems
- Implementing dual-write strategies to ensure smooth transitions from legacy to new data models
- Resolving data discrepancies by comparing outputs against legacy reporting systems
- Enhancing ETL processes by adding post-processing functions to enrich data and replace reporting layer logic
- Expanding data models with additional fields based on evolving business requirements
- Assisting team members in authoring optimized queries to produce accurate results
- Documenting data models, processes, and transformations
- Ensuring data integrity and accuracy throughout the pipeline
- Collaborating closely with cross-functional teams to translate business requirements into technical solutions
Job Specification:
Required Skills and Expertise:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field
- Experience: 3+ years of experience in data engineering with proven working experience in Google Cloud Platform
- GCP Expertise: Hands-on experience with BigQuery and Google Cloud Run for ETL workflow implementation
- ETL Development: Strong experience building, maintaining, and optimizing ETL pipelines in cloud environments
- SQL Skills: Advanced SQL proficiency, including complex queries, performance optimization, and BigQuery-specific features
- Programming: Strong experience in Python for building data pipelines and automation
- Data Modeling: Proven ability to design and implement efficient and scalable data models that support business reporting needs
- Communication: Excellent written and verbal communication skills, with the ability to work directly with clients and translate technical concepts for non-technical stakeholders
- Collaboration: Experience working with cross-functional teams including business analysts, reporting teams, and client stakeholders
- Problem-Solving: Strong analytical skills with experience troubleshooting data discrepancies and pipeline issues
- Documentation: Ability to create clear technical documentation for data flows, models, and processes
Preferred Skills:
- Experience with Looker and LookML (a major plus)
- Experience with CI/CD pipelines for data engineering workflows
- Experience with version control systems (Git)
- Understanding of data warehouse design patterns and best practices
- Experience with real-time data processing and streaming architectures
Soft Skills:
- Strong analytical thinking and problem-solving abilities
- Excellent client-facing communication skills; this role involves working closely with client stakeholders
- Ability to understand business requirements and translate them into technical solutions
- Self-motivated with ability to work independently and manage multiple priorities
- Attention to detail and commitment to data quality
- Proactive approach to identifying and resolving potential issues
- Collaborative mindset with ability to work effectively across teams
- Continuous learning mindset to stay current with evolving technologies
Application Procedure:
- Email your application and resume to [email protected]
- Mention “ETL/Data Engineer” in the email subject line.