Data Pipeline Engineer
Location: Bologna, Italy
Imagine working at a company where you create products that enrich people’s lives. With your passion and dedication, there is no telling what we could accomplish together! LHP Europe’s products and services make it possible for our customers to address their environmental impact, produce better business outcomes, and accelerate digital transformation. Our goal is to solve business challenges and deliver intelligent solutions that make a difference for their customers and employees. We aspire to make a positive impact in the community.
We are investing in individuals whose minds are constantly striving to learn and think: individuals with an entrepreneurial spirit, a deep passion and drive to solve high-value problems, and strong integrity, who are self-driven and can lead with empathy. If this is you, you will have opportunities with LHP Europe to lead the architecture, design, development, delivery, and support of products that our customers will use as an integrated tool to manage and improve their business.
Being part of the Development Team, your activities will include:
- Writing data pipelines in Python or R to extract data from blockchain & service APIs into relational databases
- Developing tools to monitor existing data pipelines and web scrapers
- Improving the data quality of the databases by working on hotfixes with the help of QA & fellow data engineers/data scientists
- Deploying machine learning models on a secured, public-facing API with Kubernetes
- Writing and securing public-facing APIs on Kubernetes to allow external partners to access our services
- Building datasets, pipelines, and training models
- Monitoring load on databases and suggesting ways to optimize SQL queries
- Developing scalable and robust software applications to ingest, consume, and visualize data for live operations and reporting
- Mentoring and developing junior members of the team
- Building software applications to interface with other systems, including data preparation jobs, containerization, and cloud deployment
- Running complex SQL queries and building data models to assist the operations teams
- Conducting data exploration and building data ingests using APIs and other methods
- Building, maintaining, and updating visualization dashboards and tools to monitor data health
- Performing quality assurance and routine diagnostics on data and data feeds
- Developing and maintaining protocols to systematize processes for future work
Requirements:
- Master’s degree in a technical discipline (Computer Science, Computer Engineering, or similar)
- 3–5 years of hands-on experience, preferably working in cross-functional teams with both technical and non-technical stakeholders
- ELT or ETL pipeline building experience
- Software Development experience
- Adept with SQL, data modelling, and complex database joins
- Strong troubleshooting skills with a passion for solving numerical and technical problems
- Extremely detail-oriented
- Adaptable, both technologically and to changing business needs
Additional Skills:
- Outstanding written and verbal communication skills
- Strong attention to detail
- Excellent spoken and written English and Italian are mandatory
Ability to work with these technologies:
- Linux and Windows-based systems
- Strong knowledge of Java, Python, and SQL (plus knowledge of JavaScript, HTML, CSS preferred)
- Python development experience is essential for building custom data pipelines
- Knowledgeable with REST APIs, HTTP, XML, CSV, JSON, and Git
What we offer:
- A supportive and ambitious work culture
- An international team
- Possibilities to grow
- A flexible working environment