About the Job
Hey! Nice to see you!
Let’s share our stories to get to know each other a bit better. We are business and technology enthusiasts, constantly hungry for new challenges and for developing ourselves and others, and nothing motivates us more than great software products and happy customers.
At Incubly, we believe that great people want to work with great people, so we set out to build a company that attracts great minds, and where achieving things together feels less like work and more like fun.
Our mission is to support tech companies, mainly startups and scaleups, in scaling their teams quickly and with high quality, and in boosting their product development, testing and deployment, so that we can succeed together.
We are currently working with a US company that provides innovative Smart City solutions, with a mission to make the world safer, smarter and more efficient. The platform we are working on combines IoT technology, connected-vehicle telemetry data, computer vision data and machine learning. The products and services built on this platform enable the future of intelligent Public Safety, Urban Mobility and Traffic Management around the world. To support that, we process tens of terabytes of data and billions of streaming events daily, and these numbers are growing rapidly as the business expands into new regions and integrates new data sources.
We're looking for an experienced Senior Data Engineer who cares deeply about their craft, and who wants to use their skills to bring positive change to the world and help us mature into a high-performing global enterprise. If you'd like to work with us, here are the competencies we are looking for.
Your daily responsibilities
Spread data culture and practices, enabling the Team to build better, more reliable, automated and secure data products.
Design and implement streaming data pipelines and batch data pipelines.
Take care of ingesting, profiling, extracting, cleaning, transforming, normalizing, organizing, loading and documenting diverse portfolios of data assets.
Implement data quality and integrity monitoring to proactively identify and fix data issues.
Design and implement solutions to support data lake and data warehousing.
Work across diverse teams in our organization to ensure that the data infrastructure is capable of meeting solution requirements.
Collaboratively determine how best to manage data under business, technical, compliance, privacy and ethical constraints.
Work with other data SMEs and stakeholders to ensure data assets are FAIR (Findable, Accessible, Interoperable, Reusable).
Contribute to an open, creative and collaborative culture where everyone feels accountable for shaping the future.
We need you to have
Data engineering and data infrastructure experience
Experience leveraging the full-stack AWS Data and Analytics ecosystem
Proficiency in SQL, Python and/or Scala.
Experience with Big Data and streaming technologies (Hadoop, EMR, Spark, Flink, Kafka/Kinesis, Firehose)
Experience working with data workflow (ETL) tools and platforms (Glue, StreamSets, Airflow, Matillion, Informatica, Talend etc.).
Experience with a variety of database, warehouse and data lake technologies (PostgreSQL, MySQL, S3/Athena, Redshift, Snowflake etc.)
Experience with NoSQL database technologies (DynamoDB, Redis, Elastic, DocumentDB etc.)
Comfort working in an Agile environment with a cross-functional, globally distributed team (including offshore teams), and the ability to manage multiple projects and priorities
High motivation and comfort with the rapidly changing nature of a startup environment.
Would be great if you have
AWS Certification (Preferred)
Experience working with environments under security and compliance regulations.
Experience with containerization and container orchestration technologies, like Docker, Kubernetes, and Argo
Experience with version control systems (Bitbucket, GitHub, GitLab)
Ability to move relentlessly forward amidst uncertainty and to raise your hand to do what’s needed.
Our Architecture and Technology Stack
AWS: Kinesis, Glue ETL, EMR, Redshift, Athena, Sagemaker, EventBridge, EKS, RDS Aurora, PostgreSQL
AWS S3 + Iceberg + Parquet
GitHub & GitHub Actions
Python, Java, Scala, Kotlin
In addition to a great company and challenging projects, we can offer much more, e.g.:
knowledge sharing within our company
budget for training and development (conferences, training, certifications etc.)
agile and friendly atmosphere, non-violent communication and full respect for diversity
possibility to choose between onsite work (in Łódź, Poland) and hybrid work (one day a week in our office in Łódź city centre)
integration budget – to get to know your colleagues and the client better
B2B or employment contract
remuneration on a B2B contract: 1.200 - 1.500 PLN net/day
remuneration on an employment contract: 20.000 - 25.000 PLN gross per month
possibility to engage not only technically, but also to have an impact on our small company’s culture
I hereby give consent for my personal data included in my application to be processed by Incubly Sp. z o.o. for the purposes of the recruitment process under the European Parliament's and Council of the European Union Regulation on the Protection of Natural Persons as of 27 April 2016, with regard to the processing of personal data and the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).