Middle Manual QA (FMC)
Required experience: 1–3 years
Full-time, regular hours
Almaty
Responsibilities include:
  • Participating in building a data integration environment and data pipelines, using tools that provide federated access and aggregate data from different sources
  • Participating in the development of ETL/ELT packages for automating data migration between accounting systems and DWH (Data Lake)
  • Preparing data for publication in marts/information panels (dashboards)/reports
  • Optimizing data collection and preparation processes/procedures to keep pace with continuous growth in processed data volumes
  • Participating in data preparation and collection for external consumers (scoring/verification systems)
  • Ensuring continuity of data preparation and loading process, timely error resolution, and recovery of interrupted processes;
  • Ensuring synchronization and correctness of the sequence of data loading processes in corporate data warehouses (Oracle & ClickHouse);
  • Participating in organizing testing processes (load/unit) for data streams;
  • Keeping documentation on data sources and data streams up to date.
Required qualifications include:
  • Experience with industrial client-server DBMS/NoSQL systems (Oracle, ClickHouse, PostgreSQL, Redis, MongoDB, etc.)
  • Experience with industrial ETL/ELT tools (Informatica PowerCenter / Oracle Data Integrator / Talend, etc.)
  • Advanced SQL and PL/SQL skills;
  • Experience in building data warehouses and multidimensional cubes (OLAP technologies);
  • Knowledge of data models: Data Vault, Star schema, Snowflake schema;
  • Experience with Business Intelligence platforms (Oracle BI, Power BI, QlikView, Tableau);
  • Understanding of protocols (HTTP(S), LDAP, SMTP, (S)FTP, SOAP), architectural styles (REST, SOA, API), and data formats (JSON, XML, WSDL, XSD);
  • Experience with version control systems (SVN, Git) and with Jira and Confluence (knowledge management).
Additional/preferred qualifications:
  • Experience with big data processing technologies: Hadoop, Spark, Kafka;
  • Experience with Apache Hive, Impala;
  • Experience in building distributed data storage and processing systems;
  • Experience with Apache Airflow and NiFi.
Perks and Benefits include:
  • Hybrid work format and flexible start of the workday;
  • Professional development: participation in meetups, trainings, conferences, demo days, and hackathons;
  • Additional 4 days of vacation — a total of 28 calendar days per year;
  • Annual bonus based on KPI achievement;
  • Corporate mobile communication;
  • Voluntary medical insurance with dental and medication coverage, with the option to add two close relatives for free;
  • Fitness subscriptions with discounts and installment plans;
  • Monetary compensation for using a personal laptop;
  • Online platform for free psychological assistance;
  • Access to the corporate online library MyBook.
Join us!
Send your resume.
Subscribe to Tele2 Space: your territory of opportunities is already here!
We will review your application soon,
but in the meantime...
The job you dreamed of is already here