Some people look at dashboards.
Others build the pipelines that make those dashboards possible.
🔥 We are looking for engineers who care about the integrity, scalability, and performance of data systems that power real operational insight.
At Jurumani Solutions, we are building modern analytics and automation platforms for one of the leading telecommunications organisations in the country, and we are looking for a Data Pipeline Engineer who thrives on solving complex data challenges at scale.
This is not a passive reporting role.
This is not just moving data from A to B.
💡 This is an opportunity to engineer the ingestion, transformation, and operational intelligence pipelines that support critical analytics, visibility, and business decision making.
If you enjoy building robust ETL/ELT pipelines, optimising data flows, working with the ELK stack, and solving large-scale data engineering challenges, this role was built for you.
You will help engineer and operate the data backbone powering analytics and visualisation platforms across large-scale environments.
Your work will include:
⚡ Building and maintaining robust ETL and ELT pipelines
⚡ Ingesting and transforming high-volume log, event, and operational data
⚡ Developing scalable Python-based data transformation logic
⚡ Managing schemas, data evolution, and analytical structures
⚡ Performing historical reprocessing and data corrections
⚡ Supporting data reliability, observability, and platform optimisation
You will work across structured, semi-structured, and log-based datasets ranging from thousands to millions of transactions per day.
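To give a flavour of the transformation work involved, here is a minimal sketch of a Pandas-based cleaning and aggregation step. The column names and logic are illustrative assumptions, not an actual schema from this role:

```python
import pandas as pd

def transform_events(raw: pd.DataFrame) -> pd.DataFrame:
    """Illustrative ETL step: clean raw event logs and aggregate
    them into per-service daily counts.

    Column names ("timestamp", "service", "status") are hypothetical
    examples, not a real production schema.
    """
    df = raw.copy()
    # Parse timestamps, coercing malformed values to NaT, then drop them
    df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")
    df = df.dropna(subset=["timestamp"])
    # Normalise status codes so downstream analytics see one spelling
    df["status"] = df["status"].str.lower().str.strip()
    # Aggregate to per-service daily event counts
    daily = (
        df.set_index("timestamp")
          .groupby("service")
          .resample("D")
          .size()
          .rename("events")
          .reset_index()
    )
    return daily
```

In a real pipeline a step like this would sit behind an orchestrator and write to an analytical store; the sketch only shows the validate-transform-aggregate shape.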
📊 Your work will directly impact operational visibility, reporting accuracy, platform intelligence, and business decision-making.
The ELK stack sits at the centre of this ecosystem.
You will help operate and optimise:
✅ Elasticsearch
✅ Logstash
✅ Kibana
✅ Beats
✅ Index lifecycle management
✅ Sharding and cluster performance tuning
✅ Data ingestion pipelines
✅ Query optimisation and analytical performance
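For context, index lifecycle management in Elasticsearch is driven by JSON policies. A minimal example looks like this; the rollover and retention thresholds below are illustrative placeholders, not values from this environment:

```json
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": {
            "max_size": "50gb",
            "max_age": "7d"
          }
        }
      },
      "delete": {
        "min_age": "90d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}
```

Tuning policies like this is part of keeping large log indices fast to query and cheap to retain.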
You will also collaborate closely with the Data Analyst to ensure data is clean, accessible, meaningful, and analytics-ready.
You are probably the type of person who:
✔️ Notices inefficiencies before others do
✔️ Loves solving data and performance bottlenecks
✔️ Gets satisfaction from clean, reliable systems
✔️ Cares deeply about correctness and data quality
✔️ Enjoys building things that scale properly
✔️ Thinks proactively instead of waiting for instructions
✔️ Wants ownership and technical trust
✔️ Enjoys learning new technologies and improving continuously
🔥 You understand that bad data creates bad decisions.
🔥 You take pride in building reliable systems.
🔥 You enjoy engineering solutions that make complex environments simpler.
The tech stack you will work with:
• ELK Stack (Elasticsearch, Logstash, Kibana)
• Python
• Pandas / Polars
• SQL
• Git / GitLab
• PostgreSQL
• MariaDB
• MongoDB
• Data orchestration and scheduling tools
• Bash scripting
• Ansible
• Data governance frameworks
• ELK alerting and anomaly detection
• Monitoring and observability tooling
Why join us? Because here, engineers are trusted to engineer.
You will:
🚀 Work on meaningful, large-scale data platforms
🚀 Build systems that directly influence operational decisions
🚀 Solve technically challenging engineering problems
🚀 Work with a highly capable and collaborative team
🚀 Be encouraged to improve, optimise, and innovate
🚀 Have ownership over real engineering outcomes
🚀 Continuously learn and grow in a fast-moving environment
We value:
💡 Capability over politics
💡 Curiosity over ego
💡 Builders over spectators
💡 Reliability over shortcuts
We care more about practical capability than perfect credentials.
If you have strong engineering instincts, production pipeline experience, and a passion for building scalable data systems, we want to hear from you.
If you want to engineer data platforms that power real operational intelligence, modern analytics, and large-scale decision making, this is your opportunity.
Bring your curiosity.
Bring your engineering mindset.
Bring your obsession with reliability and performance.
🚀 Let's build systems that turn raw data into real impact.