Computing and ICT

PASHA Insurance

DataOps (Data Operations) Engineer

City: Baku
Employment type: Full-time
Application deadline: 2024-02-24

Responsibilities:

  • Monitoring Data Pipelines: Ensuring that all data pipelines are functioning correctly and quickly addressing any issues or failures.
  • Data Quality Checks: Performing routine checks to ensure data accuracy and consistency.
  • Responding to Data Queries: Addressing urgent data requests and providing support to data users.
  • Automation Maintenance: Ensuring that automated processes and scripts are running as expected.
  • Incident Management: Responding to and resolving any operational incidents that affect data systems.
  • Performance Optimization: Reviewing system performance and optimizing data processing and storage operations.
  • Data Backup and Recovery Operations: Ensuring data is securely backed up and testing recovery procedures.
  • Data Governance and Compliance Checks: Reviewing data usage and storage against compliance standards and governance policies.
  • Report Generation: Creating and distributing regular reports on data operations, performance, and usage.
  • Infrastructure Review: Evaluating the data infrastructure for potential upgrades or improvements.
  • Data Integration: Integrating new data sources into the existing data ecosystem.
  • Disaster Recovery Execution: Implementing disaster recovery plans in case of a major incident.
  • Training and Development: Organizing or attending training sessions to stay updated with the latest data technologies and practices.
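As a concrete illustration of the routine data quality checks described above, here is a minimal Python sketch that validates a CSV payload for missing columns, empty values, and duplicate rows. The column names and metrics are illustrative assumptions, not part of the role description.

```python
import csv
import io

def check_data_quality(csv_text, required_columns):
    """Return simple quality metrics for a CSV payload.

    Checks performed: required columns present, empty-value counts
    per column, and exact duplicate rows.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = list(reader)
    fieldnames = reader.fieldnames or []

    # Columns the caller expects but the file does not provide.
    missing_columns = [c for c in required_columns if c not in fieldnames]

    # Count empty (null-like) values in each column.
    null_counts = {c: sum(1 for r in rows if not r.get(c))
                   for c in fieldnames}

    # Count rows that are exact duplicates of an earlier row.
    seen, duplicate_rows = set(), 0
    for r in rows:
        key = tuple(r.get(c) for c in fieldnames)
        if key in seen:
            duplicate_rows += 1
        seen.add(key)

    return {
        "row_count": len(rows),
        "missing_columns": missing_columns,
        "null_counts": null_counts,
        "duplicate_rows": duplicate_rows,
    }
```

In practice a check like this would run on a schedule against each incoming feed, with the resulting metrics feeding the reporting and incident-management duties listed above.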

Requirements:

Experience:

  • Cross-Disciplinary Experience: Experience that spans both development (Dev) and operations (Ops), demonstrating an understanding of the entire software development lifecycle.

Educational Background:

  • Degree in Computer Science, Information Technology, or a related field: A foundation in IT and data management principles is crucial.
  • Certifications: Certifications in relevant technologies and methodologies (like Agile, DevOps, Cloud Computing) can be an advantage.

Technical Skills:

  • Programming and Scripting: Proficiency in scripting languages such as Python, Ruby, or Bash.
  • DevOps Tools: Experience with CI/CD tools (e.g., Jenkins, GitLab CI), configuration management tools (e.g., Ansible, Chef, Puppet), and containerization technologies (e.g., Docker, Kubernetes).
  • Database Knowledge: Understanding of database management, including SQL.
  • Monitoring and Logging Tools: Familiarity with tools such as Prometheus, Grafana, and the ELK stack (Elasticsearch, Logstash, Kibana).
  • Cloud Computing: Experience with cloud services and cloud architecture.
  • System Administration: A basic understanding of system administration, especially in a Linux/Unix environment, is helpful for managing the underlying systems that data operations run on.
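To show how the scripting and automation-maintenance skills above come together, here is a minimal sketch of wrapping a pipeline step with retries and logging. The function name, retry counts, and log messages are illustrative assumptions.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dataops")

def run_with_retries(step, attempts=3, delay_seconds=0):
    """Run a pipeline step, retrying on failure.

    Logs each failed attempt and re-raises the exception once the
    final attempt fails, so upstream incident handling can take over.
    """
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise
            time.sleep(delay_seconds)
```

A wrapper like this is one of the simplest ways a DataOps engineer keeps automated processes "running as expected": transient failures are retried quietly, while persistent failures surface as logged incidents.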