Employment Type: Full-time
Location: Bangalore Urban, Karnataka, India
Education: Bachelor's Degree
Salary: 4,000,000 - 4,300,000 INR / year
Experience: 10 - 17 Yrs
Posted: Nov 05, 2024
Expires: Nov 04, 2025
Job Description
Job Location: Bangalore, Chennai, Kolkata, Gurugram, Pune
Primary Roles and Responsibilities:
Select and integrate the Big Data tools and frameworks required to provide the requested capabilities
Develop and maintain data pipelines implementing ETL processes, monitor performance, and advise on any necessary infrastructure changes
Translate complex technical and functional requirements into detailed designs
Investigate and analyze alternative solutions for data storage, processing, etc. to ensure the most streamlined approaches are implemented
Serve as a mentor to junior staff by conducting technical training sessions and reviewing project outputs
Skills and Qualifications:
Strong understanding of data warehousing and data modeling techniques.
Proficient understanding of distributed computing principles: Hadoop v2, MapReduce, HDFS
Strong data engineering skills on the GCP cloud platform: Airflow, Cloud Composer, Data Fusion, Dataflow, Dataproc, BigQuery
Experience building stream-processing systems using solutions such as Storm or Spark Streaming
Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
Experience with Spark, SQL, and Linux.
Knowledge of various ETL techniques and frameworks, such as Flume, Apache NiFi, or DBT.
Experience with various messaging systems, such as Kafka or RabbitMQ.
Good understanding of Lambda Architecture, along with its advantages and drawbacks