At Rocker, technology makes up the core of our business, and we are building it with highly skilled, talented, and ambitious people. To improve our product and business decisions as we continue to scale, Rocker is searching for a Data Engineer to support our finance, marketing, customer, and product teams.
As our only Data Engineer, you will build data pipelines (dashboards and analytics are not in scope) for our teams to leverage user insights, ensuring we continue building products customers love. You will work closely with our Product Managers and developers to make real-time information available and accessible so we can quickly adjust when necessary. You will also work closely with our finance team to help identify and mitigate risks, and you will enable our teams to leverage data when making decisions in our mission to build the best financial app on the market.
Being a successful Data Engineer requires a strong technical foundation, an understanding of business strategy, and excellent communication skills. Your ability to manage various stakeholders will enable you to take initiative with confidence and lead projects through to completion.
What you'll do:
- Take full responsibility for designing, building, and maintaining our data pipeline architecture, empowering our internal business intelligence and helping our business teams make the right decisions
- Ensure data from internal and external systems is available and structured in our BigQuery data warehouse
- Balance a long-term vision for the internal data warehouse with short-term requests; sometimes you may have to start with something quick and ugly and improve it over time
- Work closely with a highly dedicated, undoubtedly talented, multinational, diverse, food-loving, ping-pong-playing team of developers, product managers, and designers to make sure data is at the heart of all our services
What you'll need:
- The ability to manage constantly changing priorities in a fast-paced, sometimes borderline chaotic, environment
- 3+ years of experience building and maintaining data pipelines for data-hungry organisations
What will help:
- Experience with:
- Google Cloud Platform or other cloud platforms
- SQL databases, such as PostgreSQL and BigQuery
- Data pipeline tools such as Spark, Beam, or Airflow
- Python as a minimum; Scala is a bonus
- Knowledge of Kafka or similar message queues
- Knowledge of Kubernetes and Docker
- Experience with Google Cloud SQL and database migrations is a plus