Data Pipeline Engineer
We are currently looking for a skilled Data Pipeline Engineer with a strong background in Python to design and optimize data processing pipelines. You’ll play a key role in handling large volumes of data, ensuring efficiency, reliability, and scalability. If you enjoy problem-solving, have a passion for building things, and thrive in a collaborative environment, this role is for you!
Our client’s platform empowers litigators and investigators to analyze electronic evidence efficiently and intuitively. By compiling, processing, and presenting legal data through a web interface, it helps professionals uncover the narrative of who did what, with whom, and when.
Requirements – what we’re looking for
- 5+ years of experience with a strong focus on Python for data processing.
- Proficiency in AWS, with strong knowledge of cloud-based data infrastructure.
- Experience working with Amazon Aurora PostgreSQL and familiarity with other database systems.
- Excellent written and spoken English communication skills.
- Curiosity and a proactive approach to problem-solving.
- A passion for building things and optimizing workflows.
Position – how you’ll contribute
- Design and implement efficient data ingestion pipelines using Python.
- Create reusable scripts for processing various data formats.
- Optimize existing data loading processes for better performance.
- Develop and maintain a robust testing suite for data pipelines.
- Implement data validation and error-handling mechanisms.
- Collaborate with team members to understand new data requirements.
- Document code and create technical specifications.
Our Benefits
- Educational resources
- Flexible schedule and Work From Anywhere
- Referral Program
- Supportive and chill atmosphere
- Trajectory recognition plan
We are accepting applications from LATAM countries
Position at: Software Mind LATAM