Data Engineer at Co-operative Bank of Kenya

Experience: 3 Years
Company Details
Industry: Banking
Description: The Co-operative Bank of Kenya Limited is incorporated in Kenya under the Companies Act and is also licensed to conduct the business of banking under the Banking Act. The Bank was initially registered under the Co-operative Societies Act at its founding in 1965. It retained this status until 27th June 2008, when the Bank's Special General Meeting resolved to incorporate under the Companies Act with a view to complying with the requirements for listing on the Nairobi Stock Exchange (NSE).
Job Description

The Role 

Specifically, the successful jobholder will be required to:

  • Gather information from business users to understand their detailed requirements and expectations; analyze business and use-case requirements from BI analysts to determine operational problems; and define data modelling requirements and develop data structures that support the generation of business insights and strategy.
  • Carry out analysis of requirements and recommend solutions to address user requirements. 
  • Assist users in preparing system definitions/specifications, highlighting technical requirements, and roll out BI solutions to stakeholders.
  • Identify, analyze and interpret trends or patterns in complex data sets using statistical techniques and provide reports. 
  • Create, schedule, test, deploy, and maintain data pipelines from different sources to the required destinations, applying the transformations needed for reporting (ETL).
  • Design, build, and optimize data ingestion pipelines from structured and unstructured sources.
  • Ensure data quality, lineage, and governance through automated checks and metadata management.
  • Implement CI/CD for data pipelines including automated testing, version control, and rollbacks.
  • Create reusable pipeline components and templates to accelerate onboarding of new data sources.
  • Develop and maintain data models, warehouse layers, and lakehouse zones.
  • Build and automate end-to-end ML pipelines integrating training, validation, deployment, and monitoring.
  • Create feature pipelines, model training pipelines, and batch/real-time prediction services.
  • Manage ML model versioning, metadata tracking, and reproducibility.
  • Build visualizations to summarize data and present findings to business and other key stakeholders. Filter and clean data, and review reports, printouts and performance indicators to locate and correct code problems.
  • Secure BI solutions by putting adequate controls in place and restricting user access to programs in accordance with the requirements of the Bank.
  • Guide the business in drawing up report formats and wireframes, advise on the best approach to transforming data and automating reports, and design and code reports/dashboards according to user specifications, with the key objective of delivering reports that assist in decision-making and control.
  • Develop and maintain documentation/manuals on system configuration or set-up, carry out technical user training as required to enable users to interpret BI reports, and deal with data, dashboard and report queries from users, resolving or advising them accordingly.
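The ETL responsibility above (create, schedule, test and maintain pipelines that move data from source to destination with reporting transformations) can be sketched as a minimal extract–transform–load flow. This is an illustrative sketch only, not the Bank's actual stack: the table names are hypothetical, and an in-memory SQLite database stands in for the Oracle sources and destinations the role would actually use.

```python
import sqlite3

def extract(conn):
    # Extract: pull raw rows from a hypothetical source table.
    return conn.execute("SELECT account_id, amount FROM raw_transactions").fetchall()

def transform(rows):
    # Transform: keep only positive amounts and round for reporting.
    return [(acc, round(amt, 2)) for acc, amt in rows if amt > 0]

def load(conn, rows):
    # Load: write the cleaned rows to the reporting destination table.
    conn.executemany("INSERT INTO clean_transactions VALUES (?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(conn, transform(extract(conn)))

# Demo: an in-memory SQLite database stands in for both source and destination.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_transactions (account_id TEXT, amount REAL)")
conn.execute("CREATE TABLE clean_transactions (account_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_transactions VALUES (?, ?)",
    [("A1", 100.50), ("A2", -50.0), ("A3", 20.0)],  # A2 is filtered out
)
run_pipeline(conn)
result = conn.execute(
    "SELECT * FROM clean_transactions ORDER BY account_id"
).fetchall()
```

In a production setting each stage would be a separately scheduled and tested unit (e.g. in ODI, SSIS or Talend, per the skills list below), with the same extract/transform/load separation making individual steps easy to rerun and monitor.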

Skills, Competencies and Experience

The successful candidate will be required to have the following skills and competencies:

  • Bachelor of Science degree in Computer Science, IT, Software Engineering, or any other degree in related fields.
  • A minimum of 3 years’ experience in data engineering, BI and software development using Oracle.
  • Strong knowledge of and experience with ETL tools (Oracle ODI, Microsoft SSIS, Talend), query languages (Oracle PL/SQL, SQL) and programming languages (Java, Python, Scala).
  • Experience with dimensional data modeling, data management and data processing. Knowledge of statistics and experience using statistical packages to analyze large data sets (Python, R, SPSS, SAS, Excel, etc.).
  • Experience in CI/CD and automation tools (GitLab CI, Jenkins, Argo, etc.).
  • Experience with big data tools (Hadoop, Apache Hive, Scala, Kafka, Apache Spark, NoSQL databases).
  • Knowledge of visualization tools (Oracle Analytics Server, Power BI, SSRS, Tableau, QlikView).
  • Technical expertise regarding data models, database design development, data mining and segmentation techniques is desired.
  • Very good knowledge of Windows Operating Systems and a fair knowledge of Unix & Linux.
Salary: Discuss During Interview
Education: Diploma
Employment Type: Full Time

Key Skills

Data, Information Technology
