DataSpark-DevOps Engineer #TeSA

Singtel
in Singapore
Permanent, Full time
Competitive
DataSpark was created from a vision to transform Singtel's rich and unique repository of data into business value and social impact. Our data products and services provide powerful insights and advanced analytics capabilities to businesses, government agencies, and other telecommunication companies. We strive for our analytics to be trustworthy and relevant to our clients while adhering to high standards of data privacy.

At DataSpark, you get to work with rich and diverse datasets and cutting-edge technology, and you get to see the impact of your results in real business and government decisions, which in turn provide positive social benefit for consumers at large scale. As a startup that is part of Singtel, DataSpark offers an enviable work environment, combining a spirited, trailblazing culture with the backing of an established company. Working alongside creative, energetic and passionate teammates from around the world, you get to be part of our exciting growth journey as we build the company to the next level.

This job role is supported by the structured development programme for mid-career professionals under the TechSkills Accelerator (TeSA) initiative, led by Infocomm Media Development Authority (IMDA) in partnership with SkillsFuture Singapore (SSG) and Workforce Singapore (WSG).

The programme aims to provide Singaporeans aged 40 and above with a tech-related job while being reskilled or upskilled. Mid-career professionals will have the assurance of being employed in a paid job while attending structured Company-Led Training for in-demand tech skills. They will also gain experience in a tech role to tap into the good careers that the infocomm sector offers.

For more information on the programme, please visit www.imtalent.sg/TeSAMidCareer

Responsibilities

  • Define, scope, size, implement, test, and deploy existing and new infrastructure, for both clients and internal teams, that processes hundreds of terabytes each day and continues to grow
  • Develop, support, and improve tools for continuous integration, automated testing and release management
  • Install, configure and customize DataSpark software according to project requirements, including data ingestion, algorithms, APIs, UIs and security
  • Design, implement, operate and troubleshoot the automation and monitoring of our infrastructure in multiple environments and multiple data centers owned or rented from cloud providers
  • Serve as the subject matter expert on infrastructure performance to the company as well as to our clients
  • Perform system integration tests, performance tests, technical acceptance tests, and user acceptance tests to ensure proper functioning of deployed systems
  • Troubleshoot and resolve issues in multiple environments
  • Improve our infrastructure capabilities, optimizing for cost, simplicity, and maintainability

Requirements

  • Experience building and running a mission critical service at scale, including
    • Experience in software engineering, release engineering and/or configuration management
    • Experience as a systems administrator in a Linux environment
    • Experience administering open source big data systems and frameworks, including Hadoop, Spark and Presto
    • Experience in full stack Cloudera Hadoop administration
  • Solid experience operating the Amazon AWS cloud (EMR, EC2, VPC, VPN, EBS, S3, Route 53, IAM, the AWS CLI, etc.)
  • Experience building and deploying CI/CD platforms such as Jenkins, Artifactory, GitHub and Bamboo, and the ability to support applications both on premises and in the AWS cloud
  • Deep technical expertise in DevOps automation tools and scripting, e.g. Python or Ruby
  • Strong experience with open-source platforms, particularly Kubernetes and containers
  • Experience with configuration management and deployment tools such as Puppet, Ansible, Chef or Terraform
  • Experience with logging, monitoring and tracing tools, e.g. CloudWatch, Elasticsearch/Kibana (ELK), Prometheus/Grafana, New Relic, Datadog or Dynatrace
  • Demonstrated experience with software product life cycles, whether traditional enterprise software development or agile internet data product development
  • Working knowledge of network security, web and network protocols and standards
  • Knowledge of information security issues is a plus
  • Good knowledge of monitoring systems
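As a flavour of the automation scripting these requirements describe, here is a minimal, hypothetical sketch of a service health check with retries, the kind of glue code a DevOps engineer writes around monitoring. The service names, probe, and retry counts are invented for illustration; in production the probe would wrap a real HTTP health endpoint or systemd query.

```python
import time

def check_services(services, probe, retries=3, delay=0.0):
    """Probe each service up to `retries` times; return the names that stay unhealthy.

    `probe` is any callable taking a service name and returning True when the
    service responds. Injecting it keeps the logic testable without real hosts.
    """
    failed = []
    for name in services:
        for _ in range(retries):
            if probe(name):
                break
            time.sleep(delay)  # back off before retrying
        else:
            # The loop never hit `break`: the service failed every attempt.
            failed.append(name)
    return failed

if __name__ == "__main__":
    # Simulated probe: "api" recovers on its second attempt, "db" never does.
    attempts = {}
    def fake_probe(name):
        attempts[name] = attempts.get(name, 0) + 1
        return name == "api" and attempts[name] >= 2
    print(check_services(["api", "db"], fake_probe))  # ['db']
```

The dependency-injected `probe` is a common pattern in operational tooling: the same loop can drive HTTP checks, TCP connects, or dry-run simulations in tests.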

