Our Company

9Yards Technology is more than an IT consultancy; it’s a powerhouse of innovation and talent empowerment. We’re dedicated to recognizing, nurturing, and empowering talent, giving each team member a platform to explore, grow, and contribute to cutting-edge projects. Our enviable work culture is designed to inspire creativity and collaboration, supported by state-of-the-art infrastructure and modern in-house amenities.

From the latest technological systems to a collaborative environment that promotes professional and personal growth, we offer an experience that’s both fulfilling and rewarding. Every individual here plays a crucial role in shaping solutions that drive success for our clients and the industry. At 9Yards Technology, we believe that people are our greatest asset, and we are committed to building an ecosystem where talent thrives, innovation is continuous, and every project we undertake sets new standards of excellence.

Career

Explore Job Opportunities

Join our talented team of experts and make a meaningful real-world impact! Discover exciting career opportunities and become part of a dynamic culture where we collaborate to create something truly extraordinary.

Experience: 8+ Years  

Location: Pune/Gurugram/Noida/Bangalore/Hyderabad/Chennai (Hybrid)  

Key Responsibilities:  

  1. Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.  
  2. Build and maintain data integration workflows from various data sources to Snowflake.  
  3. Write efficient and optimized SQL queries for data extraction and transformation.  
  4. Work with stakeholders to understand business requirements and translate them into technical solutions.  
  5. Monitor, troubleshoot, and optimize data pipelines for performance and reliability.  
  6. Maintain and enforce data quality, governance, and documentation standards.  
  7. Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment. 

Must-Have Skills:  

  1. 8+ years of experience in data engineering roles using Azure, Snowflake, and DBT.  
  2. Strong experience with Azure Cloud Platform services.  
  3. Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.  
  4. Proficiency in SQL for data analysis and transformation.  
  5. Hands-on experience with Snowflake and SnowSQL for data warehousing.  
  6. Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.  
  7. Experience working in cloud-based data environments with large-scale datasets.  
  8. Strong problem-solving, communication, and collaboration skills.  

Good-to-Have Skills:  

  1. Experience with DataStage, Netezza, Azure Data Lake, Azure Synapse, or Azure Functions.  
  2. Familiarity with Python or PySpark for custom data transformations.  
  3. Understanding of CI/CD pipelines and DevOps for data workflows.  
  4. Exposure to data governance, metadata management, or data catalog tools.  
  5. Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus. 

Qualifications:  

  1. Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field. 

Experience: 5+ Years 

Location: Noida/Bangalore 

Relevant Functional Experience (SFIA Level): 

  1. Hands-on experience supporting SAP Ariba Buying and Invoicing, Sourcing, and Contract Management modules. 
  2. Experience in integrating Ariba with Oracle EBS (AP, PO, and GL modules); SFIA 5.

Role Specific Skills: 

  1. SAP Certified Application Associate - SAP Ariba Procurement. 
  2. SAP Ariba Integration Certification (CIG or API-based). 
  3. Oracle EBS Financials Cloud Certification (optional but valuable). 
  4. ITIL Foundation Certification (for support process alignment). 

Technical/Soft Skills: 

  1. Experience using the ServiceNow ticketing system. 
  2. Strong experience in Ariba implementation and maintenance. 
  3. Process knowledge of procurement and invoice processing. 
  4. Good communication skills. 
  5. Good analytical skills and competence in logical reasoning. 

Responsibilities: 

  1. Oversee sub-function/team activities. 
  2. Coordinate with clients, resolve issues, and manage escalations. 
  3. Align resource shifts as per WTW business requirements. 
  4. Monitor key performance parameters of the process and ensure adherence to SLAs/KPIs. 
  5. Partner with WTW on process improvement and transformation initiatives. 
  6. Lead tri-party governance and ensure timely resolution of issues. 
  7. Review small-change forms, provide effort estimates, seek approval from WTW, and implement approved changes. 
  8. Troubleshoot invoice exceptions, PO mismatches, and supplier onboarding issues. 
  9. Manage the Ariba Cloud Integration Gateway (CIG) or middleware tools for data exchange. 
  10. Support end-users with catalogue management, requisitioning, and approvals. 
  11. Coordinate with suppliers via SAP Business Network for invoice resolution. 
  12. Create functional documentation, SOPs, and training materials. 
  13. Participate in UAT, regression testing, and system upgrades. 

Experience: 4+ Years  

Location: Pune/Gurugram/Noida/Bangalore/Hyderabad/Chennai (Hybrid)  

About the Role:  

We are seeking an Azure Data Engineer with insurance domain knowledge. 

Key Responsibilities:  

  1. Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.  
  2. Build and maintain data integration workflows from various data sources to Snowflake.  
  3. Write efficient and optimized SQL queries for data extraction and transformation.  
  4. Work with stakeholders to understand business requirements, especially insurance processes such as policy, claims, underwriting, billing, and customer data, and translate them into technical solutions.  
  5. Monitor, troubleshoot, and optimize data pipelines for performance and reliability.  
  6. Maintain and enforce data quality, governance, and documentation standards.  
  7. Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment. 

Must-Have Skills:  

  1. 4+ years of experience in data engineering roles using Azure, Snowflake, and DBT.  
  2. Strong experience with Azure Cloud Platform services.  
  3. Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.  
  4. Proficiency in SQL for data analysis and transformation.  
  5. Hands-on experience with Snowflake and SnowSQL for data warehousing.  
  6. Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.  
  7. Experience working in cloud-based data environments with large-scale datasets.  
  8. Strong problem-solving, communication, and collaboration skills. 

Good-to-Have Skills:  

  1. Experience with DataStage, Netezza, Azure Data Lake, Azure Synapse, or Azure Functions.  
  2. Familiarity with Python or PySpark for custom data transformations.  
  3. Understanding of CI/CD pipelines and DevOps for data workflows.  
  4. Exposure to data governance, metadata management, or data catalog tools.  
  5. Knowledge of business intelligence tools (e.g., Power BI, Tableau).  

Qualifications:  

  1. Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field. 

Experience: 6+ Years 

Location: Pune/Gurugram/Bangalore/Noida/Hyderabad (Hybrid) 

About the Role:  

We are seeking a highly skilled Lead Engineer with deep expertise in AWS, Microsoft Fabric, and Microsoft Purview. The ideal candidate will drive the architecture, development, and implementation of scalable cloud and data governance solutions while providing technical leadership to the engineering team. 

Key Responsibilities:  

  1. Lead the design, development, and deployment of scalable cloud solutions on AWS.  
  2. Architect and implement enterprise data solutions using Microsoft Fabric, including analytics, data pipelines, and integrated lakehouse scenarios.  
  3. Implement and manage data governance, cataloging, and compliance using Microsoft Purview.  
  4. Define technical standards, best practices, and architecture patterns across cloud and data projects.  
  5. Mentor and guide junior engineers, ensuring high-quality, maintainable, and secure code.  
  6. Collaborate with business and data stakeholders to align technical solutions with organizational objectives.  
  7. Drive innovation in cloud, data, and governance initiatives to support strategic business goals. 

Required Skills & Qualifications:  

  1. 6+ years of hands-on experience in software/cloud engineering or data platform engineering.  
  2. Strong expertise in AWS services (EC2, S3, Lambda, RDS, CloudFormation, etc.) and cloud architecture.  
  3. Hands-on experience with Microsoft Fabric for building analytics, ETL pipelines, or lakehouse solutions.  
  4. Experience with Microsoft Purview is a must.  
  5. Proficiency in programming/scripting languages such as Python, SQL, or .NET.  
  6. Experience with CI/CD pipelines, DevOps practices, and infrastructure as code.  
  7. Strong problem-solving, technical leadership, and stakeholder management skills.  
  8. Excellent communication skills, with experience collaborating across distributed teams. 

Preferred:  

  1. AWS certifications (Solutions Architect, DevOps Engineer). 
  2. Microsoft Fabric or Azure Data certifications. 
  3. Prior experience in financial services, insurance, or healthcare domains. 

Experience: 2+ Years

Location: Noida

Job Summary:

We are looking for a skilled Technical Recruiter with 2+ years of experience in IT hiring. The ideal candidate will be responsible for sourcing, screening, and hiring top technical talent across various domains while ensuring a smooth and efficient recruitment process.

Key Responsibilities:

  1. Handle end-to-end recruitment for technical roles (IT/Engineering).
  2. Source candidates through job portals such as Naukri, LinkedIn, and other channels.
  3. Screen resumes and conduct initial HR/technical interviews.
  4. Coordinate with hiring managers to understand job requirements and expectations.
  5. Schedule and manage interview processes.
  6. Maintain candidate pipeline and database.
  7. Ensure a positive candidate experience throughout the hiring cycle.
  8. Track and report recruitment metrics.

Our Values

Our Work Culture Makes Us Irresistible!

For us, our talented professionals are invaluable assets who deserve every bit of care, support, and appreciation for their incredible contributions. Check out why you should be a part of our team!

5 Days a Week
Employee First
Rewards & Benefits
Medical Insurance
Learning Sessions
Quarterly Trip
Leaves Encashment
Advanced Resources
Transparent Communication
Positive Environment
Groundbreaking Projects
Interested?

Join the Growth-Led Environment!

Step into a growth-driven environment where innovation meets opportunity. Join us to unlock your true potential, collaborate with industry experts, and elevate your career in a dynamic space that fosters learning and success!