The primary tools and technologies for this role include Databricks, Azure Data Factory, Synapse, and Delta tables for data processing and management. Proficiency in Python is required; knowledge of Scala is optional but a plus. A solid understanding of ETL processing is essential for managing data workflows effectively. Familiarity with SQL, Hadoop, Spark, or other cloud services is also beneficial. The candidate should have a deep understanding of data structures and algorithms, along with strong problem-solving skills. Beyond technical expertise, the candidate should work well in a team, adapt readily to changing project requirements, and communicate complex information clearly and effectively.
