DATA ENGINEER
- Openings: 2
- Location: Ahmedabad
- Experience: 5 - 10 years
- Education: B.E, BCA, MCA, IT
- Type: Full-Time, Remote
About the Job
We are a leading data solutions company specializing in the design and implementation of scalable data pipelines and storage solutions on GCP and AWS. We are passionate about enabling businesses to harness the power of data through cutting-edge cloud platforms, delivering real-time insights, and building high-performing, secure data systems.
Job Description
We are looking for an experienced Data Engineer to join our team. This is a remote role, offering flexibility while working with the latest technologies in data engineering. You will be responsible for building scalable data pipelines, implementing ETL processes, and collaborating with cross-functional teams to ensure data availability and quality across platforms.
Responsibilities
- Data Pipeline Development: Design, build, and maintain scalable data processing systems on GCP and AWS platforms.
- ETL & Data Streaming: Develop and optimize ETL processes and real-time data streaming pipelines.
- Data Quality: Work closely with data scientists and analysts to ensure the accuracy, consistency, and availability of data.
- Security & Compliance: Implement security practices to ensure data is stored and handled securely, meeting industry compliance standards.
- Technology Optimization: Continuously optimize and improve data pipeline performance to support business needs and scalability.
Qualifications
- Experience: Minimum 5 years of experience in data engineering.
- Technical Expertise:
  - Strong proficiency in GCP and AWS services (e.g., BigQuery and Dataflow on GCP; S3 and Redshift on AWS).
  - Proven experience with data modeling, ETL processes, and building data warehousing solutions.
  - Expertise in building scalable data pipelines and implementing real-time data streaming solutions.
  - Strong programming skills in Python, Java, or Scala.
- Education: Bachelor’s degree in Computer Science, Engineering, or a related field.
- Skills & Attributes:
  - Strong understanding of cloud-based data platforms and services (GCP, AWS).
  - Experience building data architectures that scale effectively across cloud platforms.
  - Knowledge of security practices in data engineering, including encryption, data privacy, and compliance.
  - Ability to troubleshoot, debug, and resolve complex data pipeline issues.
  - Collaborative mindset with the ability to work closely with data scientists and analysts.
  - Proactive and creative problem-solving approach with a focus on innovative solutions.
  - Detail-oriented, with a strong focus on ensuring the integrity and security of data throughout its lifecycle.
  - Ability to adapt to evolving technologies and changing project requirements.
  - Proficient in English (both written and verbal) for effective communication with global teams.