
Lead Data Engineer

Cape Town


Yoco is a FinTech company. We focus on building tools and services that help small and micro businesses get paid, run better, and grow. We believe in unlocking entrepreneurial potential everywhere, in empowering people to thrive, and in driving our economy forward.
 
Yoco was founded in 2013. Today we are a company of over 150 people, all driven by a common goal: keeping the needs of our customers at heart. Our vision is Open Commerce, and we pursue it to enable people to thrive.
 
If you are passionate about building and maintaining a world-class, scalable data pipeline and warehouse, then this is the role for you! Your work will power the Yoco platform: by leveraging the rich data available, the Yoco Data Team adds real value for our customers and drives new product development, both internally and externally.
 
Being equally comfortable managing databases and building out systems calls for a dynamic individual capable of acquiring new skills at a rapid rate. You will be challenged constantly and required to find creative solutions to complex problems, while leaning on recent developments to cut out the majority of manual and repetitive work. Learning from industry-proven best practices will provide a path to success: you should be familiar with trends in the industry and their impact on any Data Team.
 
 
ROLE | WHAT YOU WILL BE DOING
 
Key Responsibilities:
 
  • Maintain and scale our cloud-based data warehouse to continually process immense volumes of high-quality data with microsecond response times for reporting, analytics and customer applications
  • Build and maintain data pipelines from cloud-based relational databases and, in future, from event streams and non-relational databases
  • Manage and maintain cloud service integrations that perform key data functions as well as integrate disparate data sources to create a single consolidated view
  • Continually plan, document and execute system expansion and architecture recommendations to support the organisation's growing data and analytical needs
  • Work cross-functionally and collaborate with various stakeholders to ensure their data needs are being met in a timely fashion
  • Continue to establish a DataOps culture within the team and ensure data engineering best practices are followed closely at all times
  • Perform technical interviews, mentor team members and continually drive innovation within the team by nurturing a willingness to experiment and tackle complex problems together
  • Deliver high quality, maintainable and performant code as well as constantly improve and teach standards for best practices
  • Scope the objectives and long-range goals of your team to align with the wider organisation's key objectives, and deliver on these in a timely fashion
 
 
IDEAL CANDIDATE | WHAT WE ARE LOOKING FOR
 
Key requirements to perform responsibilities:
 
  • 5+ years' experience in a similar role deploying production-quality code (e.g. software engineer, data infrastructure engineer)
  • 3+ years' experience leading a team of expert Data or Software Engineers
  • Transparent, articulate and driven to succeed
  • Professional experience using Python for data processing
  • Deep understanding of SQL, PostgreSQL and analytical data warehouses (Redshift preferred)
  • Experience deploying applications to cloud environments and familiarity with CI/CD tooling
  • Production experience with Apache Airflow or other data pipeline tools
  • Hands-on experience implementing ETL (or ELT) best practices at scale
  • Experience with third-party data sources and consuming data from SaaS application APIs
  • Ability to improve, manage and teach standards for code maintainability and performance, in code submitted and reviewed
  • Commitment to keeping up with advancements in data engineering
  • A great communicator, able to collaborate with stakeholders throughout the organisation
 
Bonus points/nice to have:
 
  • Computer science degree or equivalent
  • Working knowledge of Terraform and Kubernetes
  • Familiarity with the structure of data required for reporting and data science projects
  • Amazon Web (Data) Services experience (Redshift, Spectrum, S3, Athena)
  • Scala, JavaScript, Swift and/or React
 
 
THE YOCO FORMULA - HOW WE CREATE VALUE 
 
The Yoco formula is a validated approach to work and a set of behaviours that create maximum value for our customers and help us grow.
 
Core Values - Our way of working to create value & grow 
 
  • Stay Connected
  • Make Space to Explore
  • Keep it Simple
  • Master your Craft
 
Leadership Principles - How we show up, engage & treat each other
 
  • Get to know each other personally
  • Say what you think and challenge me directly
  • Be courageous and focus forward
  • Don’t let ego get in the way
 
To support this, we have built a role-based organisation where every individual is given the space to focus and develop their innate strengths. Everyone at Yoco has the opportunity to lead a project and become a specialist, enabling flexibility, collaboration and accountability at all levels. You will be working with a diverse, motivated and skilled team who will continuously stretch you as an individual. To learn more about our culture, go to our #YocoLife page or subscribe to our Exposure Gallery.
 
Join us on a meaningful journey at Yoco, and help enable our merchants to thrive through Open Commerce!
 
Please note… 
Yoco encourages applicants from diverse backgrounds to apply. Open positions at Yoco are competitive and we often receive high volumes of applicants. If you have not received further updates on your application after three weeks, you’re welcome to request feedback.
 
 

Every startup has a story; this is the Yoco story, as told by photographer and visual anthropologist Gregor Röhrig.
