Summary:
Tredence Inc. is seeking a highly skilled Azure DBX (Databricks) Engineer to join our team. As a Senior Databricks Engineer / Databricks Technical Lead / Data Architect, you will be responsible for developing modern data warehouse solutions using Databricks and the AWS/Azure stack. Your expertise in data engineering and analytics will play a significant role in delivering powerful insights into profitable actions. If you are passionate about unlocking the full potential of data and thrive in a dynamic environment, we would love to hear from you.
Details:
About Tredence:
Tredence focuses on last-mile delivery of powerful insights into profitable actions by uniting its strengths in business analytics, data science, and software engineering. With clients in the US, Canada, Europe, and Southeast Asia, we are at the forefront of helping the largest companies across industries deploy their prediction and optimization solutions at scale.
Primary Roles and Responsibilities:
- Develop modern data warehouse solutions using Databricks and AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Develop data models to fulfill reporting needs in collaboration with the business.
- Assist team members in resolving technical challenges and issues.
- Drive technical discussions with client architects and team members.
- Orchestrate and schedule data pipelines using Airflow.
Skills and Qualifications:
- Bachelor's and/or Master's degree in Computer Science or equivalent experience.
- 6+ years of IT experience, including a minimum of 3 years of experience in data warehouse/ETL projects.
- Deep understanding of star and snowflake schema dimensional modeling.
- Strong knowledge of Data Management principles.
- Experience with Databricks Data & AI platform and Databricks Delta Lake Architecture.
- Proficiency in SQL, Python, and Spark (PySpark).
- Hands-on experience with AWS/Azure stack.
- Experience with batch and streaming ETL using Kinesis (desirable).
- Experience in building ETL/data warehouse transformation processes.
- Familiarity with Apache Kafka for streaming data.
- Experience with other big data technologies like Hadoop, Hive, Pig, Impala.
- Familiarity with non-relational/NoSQL data repositories like MongoDB, Cassandra, Neo4J.
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Databricks Certified Data Engineer Associate/Professional Certification (Desirable).
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with attention to detail.
- Comfortable working in a dynamic, fast-paced, innovative environment with multiple ongoing concurrent projects.
- Experience working in Agile methodology.
We offer competitive compensation and a stimulating work environment where you can grow your skills and make a significant impact. Join our team and be part of our mission to deliver powerful insights and drive profitable actions through data analytics.
For more information about Tredence and to apply for this position, please visit our website.