This listing is over a month old and likely no longer available.

Snowflake Data Engineer

EPAM Systems
Bangalore, India · Onsite · Est. $1.2M (~₹10.0Cr) · Senior · 6-10 years · Open to all

Description

EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.

We are seeking a Snowflake Data Engineer to join our team and enhance our data solutions. The ideal candidate will be responsible for designing and maintaining efficient data structures, optimizing data storage and retrieval within Snowflake, and ensuring data integrity across various data sources. This role involves collaboration with cross-functional teams to deliver high-quality data solutions that support analytical and operational requirements.

Responsibilities

- Snowflake data modeling: Design and implement scalable Snowflake data models optimized for data ingestion and analytics requirements
- ETL pipeline development: Build and maintain robust ETL pipelines to integrate data from multiple sources into Snowflake, ensuring data integrity and consistency
- Performance optimization: Optimize Snowflake usage and storage, tuning query performance and managing data partitions to ensure quick, reliable access to data
- Data security & governance: Implement best practices in data security, role-based access control, and data masking within Snowflake to maintain compliance and data governance standards
- Automation & workflow management: Use tools such as dbt and Apache Airflow to schedule data processing and automate pipeline monitoring
- Collaboration & troubleshooting: Partner with data scientists, business analysts, and other stakeholders to address complex data challenges and troubleshoot Snowflake-related issues effectively
- Documentation & reporting: Develop comprehensive documentation for data structures, ETL workflows, and system processes to ensure transparency and knowledge sharing within the team

Requirements

- 3 to 5 years of experience in data engineering or a related field
- Proficiency in Python
- Experience with AWS as the primary cloud provider
- Expertise in Snowflake modeling, data modeling using Data Vault, and the dbt framework for data transformation pipelines
- Knowledge of workflow management tools such as Argo, Oozie, and Apache Airflow
- Understanding of the requirements for a scalable, secure, and high-performance data warehouse that supports integration with monitoring and observability tools

We offer

- Opportunity to work on technical challenges that may have impact across geographies
- Vast opportunities for self-development: online university, global knowledge sharing, and learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored Tech Talks & Hackathons
- Unlimited access to LinkedIn Learning solutions
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package: health benefits, retirement benefits, paid time off, flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)
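For readers unfamiliar with the data security & governance duties named above, here is a minimal sketch of what Snowflake role-based access control and dynamic data masking look like in practice, generated as SQL strings from Python (the listing's primary language). The role, schema, and policy names are hypothetical, not from the listing.

```python
# Illustrative sketch only: Snowflake RBAC grants and a dynamic masking
# policy built as plain SQL strings. All identifiers below are assumptions.

def grant_read_access(role: str, schema: str) -> str:
    """GRANT read-only access on every table in a schema to a role."""
    return f"GRANT SELECT ON ALL TABLES IN SCHEMA {schema} TO ROLE {role};"

def masking_policy_ddl(policy: str, allowed_role: str) -> str:
    """Masking policy that reveals a string column only to one role."""
    return (
        f"CREATE MASKING POLICY {policy} AS (val STRING) RETURNS STRING ->\n"
        f"  CASE WHEN CURRENT_ROLE() = '{allowed_role}' THEN val\n"
        f"       ELSE '***MASKED***' END;"
    )

print(grant_read_access("ANALYST_RO", "ANALYTICS.REPORTING"))
print(masking_policy_ddl("EMAIL_MASK", "PII_ADMIN"))
```

In a real deployment these statements would be executed through a Snowflake connector or managed declaratively (e.g. via dbt), and the masking policy would then be attached to specific columns with ALTER TABLE ... SET MASKING POLICY.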

Tech stack

Python, AWS

Benefits

PTO

About Bangalore, India

Cost of living

Low

Avg tech salary

12L-35L INR

Remote work

Hybrid dominant, startups offer remote

Posted 12/5/2024. Source: The Muse.

