S&P Global, Islamabad (Req ID: 301504)
Qualification:
- BS in Computer Science or Engineering
Experience:
- At least 4-6 years of professional software development experience, including experience with big data platforms such as Google Cloud Platform, Apache Hadoop, and Apache Spark.
- Deep understanding of REST, good API design, and OOP principles.
- Experience with object-oriented / functional scripting languages such as Python, C#, Scala, etc.
- Good working knowledge of relational SQL and NoSQL databases.
- Experience in maintaining and developing software in production utilizing cloud-based tooling (AWS, Docker & Kubernetes, Okta, Terraform).
- Agile experience is highly desirable.
- Experience with Snowflake and Databricks is a big plus.
Responsibilities:
- Build new data acquisition and transformation pipelines using big data and cloud technologies (see the illustrative sketch after this list).
- Work with the broader technology team, including the information architecture and data fabric teams, to align pipelines with the Lodestone initiative.
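As a rough illustration of the kind of acquisition-and-transformation pipeline work described above, the following is a minimal PySpark sketch. All paths, column names, and the application name are hypothetical placeholders and are not taken from the posting.

```python
# Minimal sketch of an acquisition/transformation pipeline in PySpark.
# Paths and column names (event_id, event_timestamp, event_date) are
# hypothetical placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("acquisition-pipeline").getOrCreate()

# Acquire: read raw source data (placeholder path).
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: basic deduplication and enrichment (placeholder columns).
cleaned = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_timestamp"))
       .filter(F.col("event_date").isNotNull())
)

# Load: write the curated output as partitioned Parquet (placeholder path).
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)

spark.stop()
```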
More Information
- City: Bucharest
- Currency: Romanian Leu (ROL)
- Number of Vacancies: 1
- Career Level: Mid-Career
- Years of Experience (Min): 4
- Years of Experience (Max): 6
- Education Major: BS/BE
- Degree: Bachelor's degree / higher diploma
- Preferred Nationality: Any Nationality
- Gender: Male
- Speciality 1: Data Engineering