About Logically
Founded in 2017, Logically combines artificial intelligence with expert analysts to tackle harmful and manipulative content at speed and scale. We work to reduce the individual, institutional, and societal damage caused by misleading and deceptive online discourse. In 2021, the company launched its threat intelligence platform - Logically Intelligence - to help governments and organisations monitor the spread of damaging narratives.
In 2023, the company announced its new independent fact-checking unit - Logically Facts - to help deliver accurate fact-checks at scale and provide people with more reliable and accurate information.
Logically has been named one of the world’s most innovative artificial intelligence companies by Fast Company and won the Rising Star in Tech award at the CogX Awards.
The company has offices in the UK, US, and India. For more information, please visit Logically.ai
The Role
We are seeking a talented and experienced Data Engineer to take a lead role in designing, building, and supporting our data acquisition pipeline. You will play a critical role in ensuring our customers have the data they need, at the right time, to achieve their outcomes.
As a senior engineer and leader, you will lead and mentor data engineers from across the organisation, driving best practice in all elements of data engineering. You will work collaboratively with other senior engineers to ensure Logically delivers world-class solutions.
Data quality, privacy, compliance, and governance are of paramount importance to us, and you will have deep experience delivering solutions that raise our standards.
Key Responsibilities:
1. Data Engineering:
Design, develop and maintain processing architecture for efficiently ingesting data from diverse sources, including databases, APIs, files, and streaming platforms.
Implement strategies for real-time and batch data ingestion, ensuring scalability and performance.
Evaluate and select appropriate data ingestion tools and technologies based on product requirements and constraints.
Actively share knowledge and understanding to inform better product and data ingestion decisions.
Communicate highly technical data risks, issues, and blockers to the wider business and stakeholders, identifying them early and acting proactively to address them.
Collaborate closely with cross-functional teams to gather requirements, design data pipelines, and maintain data quality standards.
2. Leadership:
Take the lead in driving innovation, experimenting with and adopting new technologies in line with our commercial goals.
Coach, mentor and develop junior engineers to increase knowledge and talent across the organisation.
Drive best practice in data quality, privacy, governance, and compliance solutions and implementations.
3. Performance Optimisation:
Monitor and optimise the performance of data ingestion and integration processes, identifying bottlenecks and implementing solutions for scalability and efficiency.
Conduct performance tuning and troubleshooting to resolve issues related to data processing and system performance.
Define, gather, and monitor KPIs for data ingestion, integrity, and performance, and track these metrics in a way that benefits the wider product.
4. Documentation and Knowledge Sharing:
Document data ingestion and integration processes, including data lineage, transformation logic, and system dependencies.
Provide training and support to internal stakeholders on data integration best practices, tools, and technologies.
Collaborate with teams to create and maintain documentation related to data models, schemas, etc.
Your experience:
10+ years of experience in a Data Engineer role as an individual contributor
You have at least 2 years of experience as a tech lead for a Data Engineering team
You are an engineer with a track record of driving and delivering large and complex efforts across distributed teams
You are a great communicator who builds and maintains the cross-team and cross-functional relationships essential to the team's success.
Experience with building streaming pipelines
Experience working with varied forms of data sources and handling a variety of data formats and schemas
Experience building scalable data pipelines
Experience with a variety of programming languages, such as Python, JavaScript, and TypeScript.
Experience working with technologies such as GCP data services (Pub/Sub, Dataflow, etc.), Databricks, or comparable Apache projects (Spark, Flink, Hive, or Kafka)
Experience with CI/CD practices and implementation
Understanding of data engineering tools, frameworks, and standards that improve the productivity and quality of output for data engineers across the team.
Benefits
- Unlimited holidays!
- Company performance-related bonus
- Private medical
- A remote-first working location policy
- Ability to work from anywhere in the world for up to 8 weeks of the year
- Pension
- £250 work-from-home package to improve your home working environment
- Mental health aid
- L&D perks
- Quarterly social events
Thank you for your interest in joining Logically.