Data Engineer

Sector

Data & AI

Location

Europe - Remote

Employment Type

Full-time

 

Data Engineer

Are you ready to revolutionise the world with TEKEVER? 🚀🌍

Join us, the European leader in unmanned technology, where cutting-edge advancements meet unparalleled innovation. We offer a unique surveillance-as-a-service solution that provides real-time intelligence, enhancing maritime safety and saving lives. TEKEVER is setting new standards in intelligence services, data and AI technologies.

Become part of a dynamic team transforming maritime surveillance and making a significant impact on global safety. 🌐

At TEKEVER, our mission is to provide limitless support through mission-oriented game-changers, delivering the right information at the right time to facilitate critical decisions.

If you’re passionate about technology and eager to shape the future, TEKEVER is the place for you! 👇🏻🎯

 

Job Overview:

  • As a Data Engineer, you will play a critical role in designing, building and maintaining the data pipelines and systems that support our data-driven initiatives, as well as supporting the evolution of our Data & Analytics Platform. You will work closely with data scientists, analysts and other stakeholders to ensure that our data & AI infrastructure is robust, scalable and efficient. The ideal candidate will have a strong background in data engineering, with experience in data integration, ETL processes, database management and Data & Analytics Platform development.

 

What will be your responsibilities:

  • Data Pipeline Development: Design, develop and maintain scalable and efficient data pipelines to collect, process and store large volumes of data from various sources.
  • ETL Processes: Implement ETL (Extract, Transform, Load) processes to ensure data is accurately and efficiently transformed and loaded into data storage systems.
  • Database Management: Manage and optimize databases and data warehouses to ensure data integrity, performance and availability.
  • Data Integration: Integrate data from multiple sources, including APIs, databases and external data providers, to create unified datasets for analysis.
  • Data & Analytics Platform development & expansion: support the expansion of our Data & Analytics Platform.
  • Data Quality Assurance: Implement data validation and quality assurance processes to ensure the accuracy and consistency of data.
  • Collaboration: Work closely with data scientists, analysts and other stakeholders to understand data requirements and provide the necessary data infrastructure and support.
  • Performance Optimization: Monitor and optimize the performance of data pipelines and databases to ensure efficient data processing and retrieval.
  • Documentation: Maintain comprehensive documentation of data pipelines, ETL processes and database schemas.
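As a flavour of the ETL work described above, here is a minimal, self-contained sketch in Python using pandas and SQLite. The column names and sample records are made up for illustration and do not reflect TEKEVER's actual stack or schemas:

```python
import sqlite3

import pandas as pd

# Extract: an in-memory sample standing in for an API, file, or message-queue source.
raw = pd.DataFrame([
    {"vessel_id": "A1", "lat": 38.7, "lon": -9.1, "speed_kn": "12.5"},
    {"vessel_id": "B2", "lat": 41.1, "lon": -8.6, "speed_kn": "bad"},
])

# Transform: enforce numeric types and drop rows that fail validation.
raw["speed_kn"] = pd.to_numeric(raw["speed_kn"], errors="coerce")
clean = raw.dropna(subset=["speed_kn"])

# Load: write the cleaned dataset into a relational store.
con = sqlite3.connect(":memory:")
clean.to_sql("positions", con, index=False)
print(con.execute("SELECT COUNT(*) FROM positions").fetchone()[0])  # 1
```

In production these three stages would typically be split into separate, independently retryable tasks in an orchestrator such as Airflow, rather than run in a single script.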

Profile and requirements:

  • Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.
  • Experience: 3+ years of experience in data engineering or a similar role.
  • Technical Skills:
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Experience with SQL and database management systems (e.g., MySQL, PostgreSQL, SQL Server).
  • Familiarity with big data technologies (e.g., Hadoop, Spark) and data warehousing solutions (e.g., Redshift, Snowflake).
  • Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services, with a focus on Google Cloud. Google Cloud certification is preferred.
  • Knowledge of data integration tools and frameworks (e.g., Apache NiFi, Talend, Informatica).
  • Experience with data modeling and schema design.
  • Experience with Infrastructure as Code (IaC) tools (e.g., Ansible, Terraform), data pipeline orchestration (e.g., Airflow), data exploration and dashboarding tools (e.g., Streamlit, Dash), data ingestion and serving technologies (e.g., PostGIS, Kafka, FastAPI), as well as pandas, scikit-learn and Docker.
  • Basic understanding of DevOps best practices and tools: Git, CI/CD, telemetry and monitoring, etc.
  • Analytical Skills: Strong analytical and problem-solving skills with a focus on delivering scalable and efficient data solutions.
  • Communication: Excellent verbal and written communication skills, with the ability to effectively collaborate with technical and non-technical stakeholders.
  • Attention to Detail: High attention to detail and a commitment to ensuring data quality and accuracy.
  • Adaptability: Ability to work in a fast-paced, dynamic environment and manage multiple priorities simultaneously.
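As a small illustration of the data-quality work the role involves, the sketch below checks a dataframe against a few basic rules. It is illustrative only; the column names and ranges are hypothetical, not a TEKEVER specification:

```python
import pandas as pd


def validate_positions(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality violations."""
    errors = []
    # Rule 1: required columns must be present.
    for col in ("vessel_id", "lat", "lon"):
        if col not in df.columns:
            errors.append(f"missing column: {col}")
    # Rule 2: coordinates must be within valid geographic ranges.
    if "lat" in df.columns and not df["lat"].between(-90, 90).all():
        errors.append("lat out of range")
    if "lon" in df.columns and not df["lon"].between(-180, 180).all():
        errors.append("lon out of range")
    return errors


df = pd.DataFrame({"vessel_id": ["A1"], "lat": [38.7], "lon": [-190.0]})
print(validate_positions(df))  # ['lon out of range']
```

Checks like these are usually run as a gating step inside the pipeline, so that bad batches are quarantined before they reach downstream consumers.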

 

What we have to offer you:

  • An excellent work environment and an opportunity to make a difference;
  • A salary compatible with your level of proven experience.

 

Do you want to know more about us?

Visit our LinkedIn page at https://www.linkedin.com/company/tekever/

If the above excites you, send us your application to jobs@tekever.com! 🚀👩‍💻
