DevOps & Data Engineer job at Private International Tech Company

Vacancy title:
DevOps & Data Engineer

[ Type: Full-time, Industry: Business Management and Administration, Category: Science & Engineering ]

Jobs at:

Private International Tech Company

Deadline of this Job:
Tuesday, November 26, 2024

Duty Station:
Lusaka, Zambia (South-Central Africa)

Summary
Date Posted: Tuesday, November 12, 2024. Base Salary: Not Disclosed


JOB DETAILS:
We are looking for a highly skilled Data Analyst & Engineer with DevOps expertise to join our team. This hybrid role requires a unique combination of data analysis, engineering, and DevOps skills. The ideal candidate will have experience analyzing complex datasets, designing and maintaining data pipelines, and managing cloud infrastructure in support of data-driven initiatives. This role plays a crucial part in ensuring the efficient and secure flow of data from collection to actionable insights.

Key Responsibilities:
Data Analysis & Insights:
Analyze large datasets to uncover insights and trends that inform business decisions.
Build, maintain, and improve data models for various business processes.
Develop reports, dashboards, and visualizations to present data insights using tools such as Power BI, Tableau, Mixpanel, or Kibana.
Collaborate with cross-functional teams to understand data needs and provide data-driven solutions.

Data Engineering:
Design, implement, and maintain scalable data pipelines to ensure efficient data flow across various systems.
Ensure data quality and consistency by developing robust ETL and DMS processes.
Work with databases, data warehouses, and data lakes (e.g., AWS Aurora, Snowflake) to store, retrieve, and process data.
Automate data workflows, data integration, and processing tasks.
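By way of illustration only (not part of the role description), the pipeline and data-quality work listed above might look like this minimal extract-transform-load sketch; the data, table name, and column names are all hypothetical:

```python
# Minimal illustrative ETL sketch: extract raw records, enforce data quality
# by dropping malformed rows, and load aggregated results into a store.
# All names here (user_totals, user_id, amount) are hypothetical examples.
import csv
import io
import sqlite3

# Hypothetical raw source data; one record is deliberately malformed.
RAW = "user_id,amount\n1,10.50\n2,not_a_number\n1,4.25\n2,3.00\n"

def run_etl(raw_csv: str, conn: sqlite3.Connection) -> int:
    """Extract rows, validate and aggregate them, and load totals per user."""
    totals: dict[str, float] = {}
    for row in csv.DictReader(io.StringIO(raw_csv)):      # extract
        try:
            amount = float(row["amount"])                 # transform / validate
        except ValueError:
            continue                                      # data-quality gate: skip bad rows
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + amount
    conn.execute(
        "CREATE TABLE IF NOT EXISTS user_totals (user_id TEXT PRIMARY KEY, total REAL)"
    )
    conn.executemany(                                     # load
        "INSERT OR REPLACE INTO user_totals VALUES (?, ?)", list(totals.items())
    )
    return len(totals)

conn = sqlite3.connect(":memory:")
loaded = run_etl(RAW, conn)  # the malformed record is dropped, valid rows aggregated
```

In production this kind of step would typically run inside an orchestrator such as Airflow, with monitoring and alerting around the quality gate rather than a silent skip.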

DevOps for Data:
Manage and monitor cloud infrastructure (AWS, Azure, or GCP) for data solutions, ensuring high availability and scalability.
Build and maintain CI/CD pipelines for data operations, ensuring rapid, error-free deployments.
Implement automation tools for data pipeline orchestration.
Ensure security best practices are followed, including data encryption and access management.

Collaboration & Communication:
Work closely with data scientists, software engineers, product managers, and business stakeholders to align on data requirements and objectives.
Act as a bridge between the data and DevOps teams, ensuring smooth integration of data solutions into the broader technology infrastructure.

Qualifications:
Education: Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, Information Technology, or a related field.
Experience:
Proven experience as a Data Analyst, Data Engineer, or in a similar role with a solid understanding of data architecture and infrastructure.
Hands-on experience with DevOps tools and practices, especially in cloud environments (AWS, GCP, or Azure).
Experience working with ETL/ELT pipelines, data warehousing, and big data processing frameworks.
Strong SQL skills and experience working with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB).
Experience with CI/CD pipelines, version control (Git), and infrastructure as code (e.g., Terraform, CloudFormation).
Familiarity with containerization and orchestration tools like Docker and Kubernetes.

Technical Skills:
Proficiency with Python, SQL, and data processing frameworks (e.g., Apache Spark, Pandas).
Familiarity with data visualization tools like Power BI, Tableau, or similar.
Experience with cloud services such as AWS, GCP (BigQuery, Cloud Storage), or Azure.
Knowledge of automation tools (e.g., Airflow, Jenkins) and monitoring/logging tools (e.g., Prometheus, Grafana).

Soft Skills:
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills to work with cross-functional teams.
Ability to work in a fast-paced, agile environment.

Preferred Qualifications:
Experience with machine learning pipelines and integrating them into production environments.
Familiarity with big data technologies such as Kafka, Hadoop, or Spark.
Knowledge of security best practices in data engineering and cloud-based DevOps environments.
Certifications in cloud platforms (e.g., AWS Certified Data Analytics or Google Cloud Professional Data Engineer).

Benefits:
Competitive salary and benefits package.
Opportunity to work with cutting-edge technologies in data analytics and DevOps.
Growth opportunities within a collaborative and innovative work environment.
Flexible working hours and remote work options.

Job Experience: No Requirements

Work Hours: 8


Level of Education:
Bachelor Degree

Job application procedure
Interested in applying for this job? Click here to submit your application now.

