S&P Global, Islamabad
Req ID: 301504

Qualification:

  • BS in Computer Science or Engineering                   

Experience:

  • 4-6 years of professional software development experience, including experience with big data platforms such as Google Cloud Platform, Apache Hadoop, and Apache Spark.
  • Deep understanding of REST, good API design, and object-oriented programming principles.
  • Experience with object-oriented or functional scripting languages such as Python, C#, or Scala.
  • Good working knowledge of relational SQL and NoSQL databases.
  • Experience maintaining and developing production software using cloud-based tooling (AWS, Docker, Kubernetes, Okta, Terraform).
  • Agile experience is highly desirable.
  • Experience with Snowflake and Databricks is a big plus.

Responsibilities:

  • Build new data acquisition and transformation pipelines using big data and cloud technologies.
  • Work with the broader technology team, including the information architecture and data fabric teams, to align pipelines with the Lodestone initiative.
