
Location
Noida/Nagpur
Employment Type
Talent Development Programme (TDP)
Experience Level
0-1 Year
Education
Bachelor's or Master's degree in computer science, information technology, or a related field.
Role Summary
Support the design and implementation of data architecture, including data modeling, storage, and pipeline development in Snowflake. Optimize performance, ensure data security and compliance, and collaborate with cross-functional teams to meet evolving business data requirements.
Job Description
- Develop and maintain data pipelines that extract, transform, and load data from various sources into Snowflake, using tools such as SQL, Python, and ETL/ELT platforms to build scalable and efficient pipelines (a minimal pipeline sketch follows this list).
- Determine data storage and processing requirements, design data models, and create pipelines to ingest, transform, and load data into Snowflake.
- Design and optimize Snowflake data pipelines and queries by applying best practices for data ingestion, transformation, and query performance. Continuously monitor platform performance and proactively implement improvements to scalability, efficiency, and cost.
- Ensure data security and regulatory compliance within Snowflake by implementing robust access controls, encryption, and governance frameworks (see the access-control sketch after this list).
- Maintain adherence to data privacy standards while safeguarding sensitive and business-critical information.
- Collaborate with cross-functional teams to ensure that data solutions meet the organization's needs.
- A professional Databricks certification or equivalent is highly preferred.
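
To make the pipeline responsibilities above concrete, here is a minimal ELT sketch using the snowflake-connector-python package. It is illustrative only: the warehouse, stage, and table names are hypothetical placeholders, not part of this role's actual stack, and the raw table is assumed to already exist.

```python
# Minimal ELT sketch: staged files -> raw table -> curated table in Snowflake.
# Assumes the snowflake-connector-python package; all object names below
# (LOAD_WH, ANALYTICS, orders_stage, raw_orders, ...) are hypothetical.
import os

import snowflake.connector


def run_pipeline() -> None:
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="LOAD_WH",   # hypothetical warehouse
        database="ANALYTICS",  # hypothetical database
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # Extract/Load: ingest staged CSV files into an existing raw table.
        cur.execute("""
            COPY INTO raw_orders
            FROM @orders_stage/daily/
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """)
        # Transform: materialize a cleaned table from the raw data.
        cur.execute("""
            CREATE OR REPLACE TABLE ANALYTICS.CURATED.orders AS
            SELECT order_id,
                   TO_DATE(order_ts)     AS order_date,
                   TRY_TO_NUMBER(amount) AS amount
            FROM raw_orders
            WHERE order_id IS NOT NULL
        """)
    finally:
        conn.close()


if __name__ == "__main__":
    run_pipeline()
```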
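Likewise, a minimal sketch of the access-control and governance work described above, using Snowflake role-based grants plus a dynamic data-masking policy (an Enterprise Edition feature). All role, schema, and column names are illustrative, and the connecting role is assumed to hold the privileges needed to run these statements.

```python
# Minimal Snowflake governance sketch: create a read-only role, grant it
# access to a curated schema, and mask a sensitive column for everyone
# except a privileged role. Object names are hypothetical placeholders.
import os

import snowflake.connector

STATEMENTS = [
    "CREATE ROLE IF NOT EXISTS analyst_role",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE analyst_role",
    "GRANT USAGE ON SCHEMA ANALYTICS.CURATED TO ROLE analyst_role",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.CURATED TO ROLE analyst_role",
    # Dynamic data masking: hide email addresses from non-privileged roles.
    """CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
       RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END""",
    """ALTER TABLE ANALYTICS.CURATED.customers
       MODIFY COLUMN email SET MASKING POLICY email_mask""",
]


def main() -> None:
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        role="SECURITYADMIN",  # assumed to have the required privileges
    )
    try:
        cur = conn.cursor()
        for stmt in STATEMENTS:
            cur.execute(stmt)
    finally:
        conn.close()


if __name__ == "__main__":
    main()
```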
Key Skills
- ETL Processes
- ELT Processes
- Databricks data platform
- Snowflake data platform
- Data Modeling
- Data Loading
- SQL
- Data Pipeline
- Python
- PySpark
- Spark
Benefits
- Opportunity to transition into a full-time role based on performance.