
Work at the heart of change

This is a place to grow, learn and connect. Everything that makes you who you are is welcome here.

Salesforce Business Analyst with ServiceMax

Location: Las Vegas
Job Type: Contract-to-Hire (CTH)
Work Mode: Onsite


Job Overview

We are seeking an experienced Salesforce ServiceMax / Field Service Lightning (FSL) Business Analyst for an onsite role. The ideal candidate will work closely with business stakeholders, field operations teams, and Salesforce delivery teams to gather requirements, design functional solutions, and support the successful implementation of Salesforce Field Service Lightning.


Key Responsibilities

  • Work onsite with business stakeholders, field technicians, dispatchers, and service managers to understand field service processes
  • Elicit, analyze, and document business requirements for Salesforce ServiceMax and FSL solutions
  • Translate business requirements into functional specifications, user stories, and acceptance criteria
  • Design functional solutions covering:
    • Work Orders
    • Service Appointments
    • Dispatcher Console
    • Service Territories
    • Skills and Scheduling Policies
  • Collaborate with architects and developers to ensure solutions align with business needs and Salesforce best practices
  • Support configuration validation, functional testing, and User Acceptance Testing (UAT)
  • Assist with data validation, reporting requirements, and process documentation
  • Facilitate workshops, requirement walkthroughs, and stakeholder reviews
  • Support change management, training, and post-go-live stabilization activities

Required Skills & Experience

  • 6+ years of experience as a Business Analyst, with at least 2 years focused on Salesforce Field Service Lightning (FSL)
  • Strong understanding of field service business processes and workforce management
  • Hands-on experience with Salesforce Service Cloud and Field Service Lightning features
  • Experience creating user stories, process flows, functional specifications, and acceptance criteria
  • Solid knowledge of Salesforce data model, security model, and reporting capabilities
  • Experience working in Agile/Scrum environments
  • Excellent communication, facilitation, and onsite stakeholder management skills

Preferred Qualifications

  • Salesforce Field Service Lightning Consultant or Business Analyst certification
  • Experience with ServiceMax or other Field Service Management platforms
  • Exposure to integrations with ERP, inventory, or billing systems
  • Experience supporting large enterprise or global Salesforce implementations

Additional Information

  • Contract-to-Hire (CTH) role
  • Open to U.S. Citizens and Green Card holders only

Java Developer

Location: New York City, NY (Onsite)

Job Overview

We are seeking a skilled and motivated Java Developer to join our team in New York City. The ideal candidate will have strong backend development experience, a solid foundation in cloud technologies, and exposure to modern AI-driven development practices. Experience in the retail industry is essential for this role.

Key Responsibilities

  • Design, develop, and maintain scalable backend applications using Java and Spring Boot.
  • Build and integrate RESTful APIs for internal and external services.
  • Work with AWS cloud services to deploy and manage applications.
  • Collaborate with cross-functional teams to deliver high-quality software solutions.
  • Implement best practices for CI/CD, logging, monitoring, and security.
  • Leverage AI tools and technologies to enhance development productivity and innovation.
  • Participate in code reviews, testing, and troubleshooting to ensure optimal performance.
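
As a flavor of the API work described above, here is a minimal sketch of building a REST request with the Java 11+ `java.net.http` client. The base URL, resource path, and class name are illustrative placeholders, not part of any actual project.

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.time.Duration;

public class ProductApiClient {
    // Placeholder base URL for illustration only.
    private static final String BASE_URL = "https://api.example.com/v1";

    // Build (but do not send) a GET request for a product resource.
    public static HttpRequest buildGetProduct(String productId) {
        return HttpRequest.newBuilder()
                .uri(URI.create(BASE_URL + "/products/" + productId))
                .header("Accept", "application/json")
                .timeout(Duration.ofSeconds(10))
                .GET()
                .build();
    }

    public static void main(String[] args) {
        HttpRequest request = buildGetProduct("sku-123");
        System.out.println(request.method() + " " + request.uri());
        // prints: GET https://api.example.com/v1/products/sku-123
    }
}
```

The same request object would be sent with `HttpClient.send(...)`; in practice the response handling, retries, and authentication headers depend on the service being called.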

Required Qualifications

  • Proven experience in the retail industry.
  • Strong proficiency in Java and Spring Boot for backend development.
  • Hands-on experience with AWS services, including EC2, S3, Lambda, RDS or DynamoDB, and API Gateway.
  • Experience developing and consuming REST APIs.
  • Working knowledge of Generative AI concepts, including Large Language Models (LLMs), prompt engineering, embeddings, and AI integrations.
  • Familiarity with AI-assisted development tools such as GitHub Copilot, Amazon Q, or similar.
  • Experience using GitHub for version control and team collaboration.
  • Proficiency with Postman for API testing and validation.
  • Solid understanding of CI/CD pipelines, application logging, monitoring, and cloud security fundamentals.

Preferred Qualifications

  • Experience working in agile development environments.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration abilities.

Employment Type

  • Full Time

Snowflake Data Engineer

Location: North Carolina (Remote)

Job Description

  • Design, develop, and optimize data warehousing solutions using Snowflake to meet enterprise data requirements

  • Utilize ANSI SQL for complex queries and data manipulation tasks within Snowflake environments

  • Implement data transformation pipelines with DBT to ensure reliable and maintainable data workflows

  • Collaborate with cross-functional teams to build scalable and efficient data architectures

  • Ensure adherence to data security, governance, and compliance standards within cloud data platforms

  • Analyze complex business requirements and translate them into technical solutions leveraging Snowflake and DBT

  • Stay updated with emerging technologies and industry best practices related to Snowflake, ANSI SQL, and DBT

Roles and Responsibilities

  • Architect and lead the implementation of Snowflake data warehouse solutions aligned with business objectives

  • Develop and enforce best practices for Snowflake and DBT usage, including performance tuning and cost optimization

  • Collaborate closely with data engineers, analysts, and stakeholders to deliver high-quality, reliable data products

  • Oversee data migration, integration, and transformation processes within Snowflake environments

  • Conduct code reviews, troubleshoot issues, and optimize existing Snowflake and DBT implementations

  • Provide technical leadership and strategic direction on Snowflake and DBT integration

  • Lead training sessions and knowledge-sharing workshops to enhance team expertise in Snowflake, ANSI SQL, and DBT technologies

  • Engage with senior management to align data platform initiatives with organizational goals

Additional Information:

  • Only U.S. citizens are eligible.

Employment Type: 

  • Contract.

Technical Product Manager

Location: Eden Prairie, MN (Hybrid)

Key Responsibilities

  • Research internal and external customers' business practices and needs relative to new features and technical feasibility
  • Think strategically: move strategy into reality and link business context to technical reality
  • Assess business rules
  • Analyze complex business and system requirements
  • Apply deep experience in discovery, requirement definition, and user story definition
  • Maintain market awareness (competitors, market conditions and activities, key and potential customers, competing and complementary technologies)
  • Train and educate business and product teams on architecture design and roadmap prioritization

Required Skills & Qualifications

  • Bachelor's degree in Computer Science, Business, or a related field
  • Experience with product tools, including Aha! and Rally
  • 8+ years of hands-on experience in business analysis
  • Experience driving product development lifecycles in the healthcare domain
  • Able to articulate a detailed product overview, benefits, and associated personas
  • Collaborate with business analysts and functional teams to understand requirements and translate them into technical solutions
  • Document technical specifications, configuration steps, SOPs, and user guides
  • Work with client TPMs to understand the roadmap and business requirements
  • Good understanding of Agile
  • Ensure stories are groomed in sufficient detail for development teams

Licensed Social Worker

Location: Trenton, New Jersey, 08628

  • Provides individual, group, and family therapy
  • Arranges the schedule for mental health clinic sessions within an assigned area
  • Interviews persons referred to the clinic for examination; meets with families, representatives of public and private agencies, schools, police departments, and other interested persons; obtains significant social, psychological, psychiatric, and other data needed for diagnosis and treatment
  • Prepares detailed and accurate histories of patients; incorporates pertinent information from outside agencies
  • Assigns patients to examiners
  • Explains the services and facilities offered through the clinic to patients and their families, social workers, teachers, judges, and other interested persons
  • Interprets the recommendations of the clinic staff to patients and their families, social workers, teachers, and other interested persons
  • Provides guidance and assistance to persons in need of psychiatric care and counseling
  • Functions as a consultant to private and public welfare agencies and organizations that refer patients to the clinic
  • Provides assistance to patients needing help to make satisfactory adjustments; speaks with parents and other responsible adults on behalf of pediatric patients
  • Conducts discussions before lay, professional, and other groups to promote the mental health programs of the agency
  • Prepares detailed correspondence in the course of official duties
  • Prepares clear, technically sound, accurate, and informative psychiatric and other related reports containing findings, conclusions, and recommendations
  • Maintains essential records, reports, and files
  • Will be required to learn to use the various electronic and/or manual recording and information systems used by the agency, office, or related units

Java Solution Architect

Client: HTC Global Services
Location: Chennai / Hyderabad (Hybrid)
Experience: 12 – 15 Years
Salary: ₹44,00,000 – ₹50,00,000 per annum
Interview Process: 3 Rounds (Face-to-Face Mandatory)
Shift Timing: General Shift


Job Overview

We are seeking a highly experienced Java Solution Architect with deep expertise in modern application architecture, including cloud-native design, microservices, and DevSecOps practices. The ideal candidate is a strategic thinker with strong hands-on capabilities, responsible for designing scalable, resilient, and observable enterprise-grade systems using Java and related technologies.


Key Responsibilities

  • Lead architecture, design, and implementation of modern Java-based applications using microservices and cloud-native patterns.
  • Translate business requirements into robust architectural solutions aligned with enterprise standards.
  • Drive best practices in scalability, performance, security, and observability.
  • Collaborate with cross-functional teams including Development, DevOps, QA, and Business stakeholders.
  • Provide architectural governance and technical leadership across multiple projects.
  • Define and implement API-first strategies using REST, GraphQL, and event-driven architectures.
  • Conduct architecture reviews, Proof of Concepts (PoCs), and technology evaluations.
  • Guide teams on containerization (Docker) and orchestration platforms like Kubernetes.
  • Establish and manage observability frameworks using tools such as Prometheus, Grafana, ELK Stack, OpenTelemetry, Jaeger, Zipkin, or Dynatrace.
  • Stay updated with emerging technologies and assess their relevance to enterprise architecture.

Technical Skills & Requirements

  • Strong expertise in Java 11+, Spring Boot, and Microservices Architecture.
  • Experience with at least one major cloud platform: AWS, Azure, or GCP.
  • Hands-on experience with Docker and orchestration tools like Kubernetes (EKS/AKS/GKE).
  • Experience with API Gateways and service mesh technologies such as Istio or Linkerd.
  • Strong understanding of RESTful APIs, GraphQL, gRPC, and asynchronous messaging systems (Kafka, RabbitMQ).
  • Experience with NoSQL (MongoDB, Cassandra) and relational databases, including distributed data modelling.
  • Proven ability to design systems for resilience, scalability, and observability.
  • Experience with DevOps tools like Jenkins, GitOps, ArgoCD, Terraform, etc.
  • Strong knowledge of logging, tracing, and monitoring frameworks.

Must-Have Skills

  • Minimum 5+ years of hands-on experience in:
    • Java 11+
    • Spring Boot
    • Microservices Architecture
    • Cloud Platforms (Azure or GCP preferred; AWS acceptable)
  • Hands-on experience in:
    • Docker & Kubernetes
    • API Gateways & Service Mesh (Istio/Linkerd)
    • REST, GraphQL, gRPC
    • Messaging systems (Kafka, RabbitMQ)
    • Observability tools and DevOps toolchains

Additional Guidelines

  • Hybrid work model (Chennai / Hyderabad)
  • Candidate must be available for Face-to-Face interviews
  • Strong communication and stakeholder management skills required

MLOps Engineer

Client: HTC Global Services
Location: Bangalore / Chennai / Hyderabad, India
Experience: 6 – 10 Years
Salary: ₹19,00,000 – ₹25,00,000 per annum
Work Mode: Hybrid (2nd Shift)
Joining Timeline: Immediate to 15 days
Interview Process: 3 Rounds


Job Summary

We are seeking a skilled MLOps Engineer to design, deploy, and manage scalable machine learning systems in production environments. The ideal candidate will have strong experience in ML engineering, DevOps, and data engineering, with a focus on building reliable pipelines, automating workflows, and ensuring model performance and stability.


Key Responsibilities

Model Deployment

  • Deploy machine learning models into production using scalable frameworks
  • Build APIs and batch/real-time data pipelines
  • Containerize ML models using Docker
  • Orchestrate workflows and deployments using Kubernetes
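
Containerizing a model as listed above is often a thin wrapper around a scoring service. A hedged Dockerfile sketch follows; every file name here (`requirements.txt`, `serve.py`, `model/`) is a placeholder, and the serving framework is left open:

```dockerfile
# Illustrative sketch only: package a trained model behind a small HTTP API.
FROM python:3.11-slim
WORKDIR /app

# Install pinned Python dependencies first so this layer caches well
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the serialized model artifact and the scoring service
COPY model/ ./model/
COPY serve.py .

EXPOSE 8080
CMD ["python", "serve.py"]
```

The resulting image would then be deployed and scaled via Kubernetes, as the next bullet describes.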

CI/CD for Machine Learning

  • Design and implement CI/CD pipelines for ML workflows
  • Automate model training, validation, testing, and deployment
  • Implement version control for code, datasets, and models

Monitoring & Maintenance

  • Monitor model performance, accuracy, and drift
  • Ensure system reliability, scalability, and performance
  • Troubleshoot production issues and optimize pipelines

Required Skills & Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field
  • 6+ years of experience in ML Engineering, DevOps, or Data Engineering
  • Proven experience deploying ML models in production environments
  • Strong understanding of microservices architecture
  • 5+ years of hands-on experience in ML, MLOps, DevOps, and Python (mandatory)

Technical Skills

Programming:

  • Python (mandatory)
  • Familiarity with Java or Scala (preferred)

ML Frameworks:

  • TensorFlow, PyTorch, Scikit-learn

MLOps Tools:

  • MLflow, Kubeflow, or SageMaker
  • Airflow or Prefect

Cloud Platforms:

  • AWS, Azure, or GCP

Containerization & Orchestration:

  • Docker, Kubernetes

CI/CD Tools:

  • Jenkins, GitHub Actions, or GitLab CI

Data Technologies (Preferred):

  • SQL, Spark, Kafka

Must-Haves

  • 6+ years of relevant experience in ML Engineering / DevOps / Data Engineering
  • Strong expertise in Python, ML, MLOps, and DevOps practices
  • Experience deploying and managing ML models in production
  • Understanding of microservices architecture
  • Ability to join within 15 days

Nice-to-Have Skills

  • Experience with SQL, Apache Spark, and Kafka
  • Exposure to large-scale distributed data systems

Sr. Applications Analyst – Salesforce / Conga CLM / X-Author

Location: Bangalore / Chennai / Hyderabad, India
Experience: 6 – 8 Years
Salary: ₹20,00,000 – ₹28,00,000 per annum
Joining Timeline: Immediate to 1-week joiners only
Interview Mode: Video Discussion
Client Name: HTC GLOBAL SERVICES


Job Overview

We are looking for a highly skilled Sr. Applications Analyst with strong expertise in Salesforce, Conga CLM, and Conga X-Author for Excel (mandatory). The role involves leading end-to-end application design, development, enhancements, and maintenance initiatives.

The ideal candidate will collaborate with business stakeholders to design scalable solutions, optimize business processes, and ensure high application performance and reliability.


Key Responsibilities

  • Analyze business requirements and design scalable solutions on the Salesforce platform
  • Architect and configure solutions for Conga CLM, including contract lifecycle processes such as creation, approvals, workflows, renewals, and integrations
  • Design and implement solutions using Conga X-Author for Excel for data management, reporting, and document automation
  • Develop and maintain Salesforce customizations (flows, workflows, reports, dashboards, security models, automation)
  • Perform application enhancements, defect resolution, and continuous improvements
  • Create and maintain functional and technical design documentation
  • Support integrations between Salesforce and enterprise applications
  • Participate in testing, release management, deployment, and production support activities
  • Ensure adherence to best practices, governance, and security standards
  • Collaborate with cross-functional teams (business, IT, vendors) in an Agile environment

Required Skills & Qualifications

  • 6+ years of experience in Salesforce design, development, and application maintenance
  • Strong hands-on experience in Salesforce Administration and Configuration
  • Mandatory experience in Conga CLM and Conga X-Author for Excel
  • Solid understanding of Salesforce architecture (objects, flows, reports, dashboards, profiles, permission sets)
  • Good understanding of Contract Lifecycle Management (CLM) processes
  • Strong analytical, solution design, and problem-solving skills
  • Excellent communication and stakeholder management abilities

Preferred Skills

  • Salesforce Admin / Platform Certifications
  • Experience with Apex, Lightning Components, APIs, and integrations
  • Exposure to Agile / Scrum methodologies

Must-Haves

  • 6+ years of Salesforce experience (design, development, maintenance)
  • Strong Salesforce Administration / Configuration expertise
  • Hands-on experience in Conga CLM and Conga X-Author for Excel (mandatory)
  • Strong understanding of Salesforce architecture and CLM processes
  • Excellent problem-solving and communication skills
  • Availability to join within 1 week

Site Reliability Engineer (SRE) – GCP & Jenkins

Location: Trivandrum / Pune (Hybrid – 3 days WFO at any of these UST offices)
Employment Type: Full-Time
Positions: 2
Experience: 5+ Years
Salary Range: ₹22–28 LPA
Notice Period: Immediate to 15 Days


Job Overview:

We are looking for a highly skilled Site Reliability Engineer (SRE) with strong expertise in Google Cloud Platform (GCP) and Jenkins (Declarative Pipelines). This role focuses on building scalable CI/CD pipelines, enhancing system reliability, and driving automation across the full software development lifecycle.


Key Responsibilities:

CI/CD & Automation:

  • Design and implement end-to-end CI/CD pipelines using Jenkins (Declarative Pipelines)
  • Automate build, test, and deployment workflows
  • Optimize pipeline performance, scalability, and security
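
For context, a Jenkins Declarative Pipeline of the kind described above typically looks like the following sketch; the stage names, Gradle commands, and deploy script are assumptions for illustration, not a specific project's pipeline.

```groovy
// Minimal Declarative Pipeline sketch; commands are illustrative placeholders.
pipeline {
    agent any
    options {
        timeout(time: 30, unit: 'MINUTES')   // fail fast on hung builds
    }
    stages {
        stage('Build') {
            steps { sh './gradlew assemble' }
        }
        stage('Test') {
            steps { sh './gradlew test' }
        }
        stage('Deploy') {
            when { branch 'main' }            // deploy only from the main branch
            steps { sh './deploy.sh' }
        }
    }
    post {
        always { junit 'build/test-results/**/*.xml' }  // publish test reports
    }
}
```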

GCP Infrastructure & Cloud Operations:

  • Manage and automate deployments across GCP services (GKE, Cloud Run, Compute Engine, Cloud SQL, VPC, Cloud Storage)
  • Ensure secure, scalable, and cost-optimized cloud infrastructure
  • Maintain cloud networking, compute, and storage configurations

Reliability & Monitoring:

  • Implement monitoring, logging, and alerting systems
  • Define and track SLIs and KPIs (availability, latency, performance)
  • Improve system resiliency, scalability, and performance

Incident & Lifecycle Management:

  • Handle P1/P2 incidents with quick resolution
  • Ensure proactive monitoring and preventive automation
  • Support capacity planning, load testing, and system readiness

Collaboration & Governance:

  • Work closely with development and DevOps teams
  • Ensure adherence to security and compliance standards
  • Maintain documentation and architectural best practices

Must-Have Skills (Strict Screening):

  • 5+ years of experience as an SRE (mandatory)
  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Proven expertise in Jenkins (Declarative Pipelines)
  • Strong understanding of CI/CD architecture and automation
  • Experience in monitoring tools (Prometheus, Grafana, etc.)
  • Strong scripting skills (Shell/Python)
  • Ability to work independently and own deliverables
  • Good stability in career (mandatory)
  • Notice Period: 0–15 days only (strict)
  • Willing to work from Trivandrum/Pune (Hybrid model)

Good-to-Have Skills:

  • Experience with Kubernetes (GKE)
  • Exposure to Cloud Functions, Cloud Run
  • Knowledge of infrastructure as code (Terraform preferred)
  • Experience with high-availability and distributed systems

Recruiter Screening Guidelines:

  • Validate hands-on GCP experience (not just basic exposure)
  • Check real Jenkins Declarative pipeline implementation
  • Assess incident handling (P1/P2 scenarios)
  • Ensure candidate has worked in production environments
  • Strong focus on ownership and independent working capability

Interview Process:

  • 2 Technical Rounds
  • 1 Client Round

Data Architect – Adobe Experience Platform (AEP)

Location: Remote


Role Overview

We are seeking an experienced Data Architect with deep expertise in data modeling, ETL processes, and large-scale data systems. In this role, you will design and implement customer-centric data solutions for Adobe Experience Platform (AEP), working closely with enterprise clients and internal teams to enable advanced customer analytics and data-driven decision-making.


Key Responsibilities

  • Engage with enterprise customers to gather requirements, design data solutions, and provide strategic recommendations
  • Lead client discussions, workshops, and project calls, coordinating with Project Managers as needed
  • Define and deliver detailed technical specifications and data architecture documentation
  • Design scalable and efficient data schemas for Adobe Experience Platform
  • Build and optimize data ingestion pipelines for large and complex datasets
  • Structure and model data to enable dynamic, customer-level analytics and insights
  • Develop processes for Customer ID mapping to create a unified 360° customer view across multiple data sources
  • Automate data workflows including data movement, transformation, and cleansing using scripting languages
  • Collaborate with onshore and offshore engineering teams to deliver high-quality solutions
  • Track, manage, and forecast effort across multiple customer engagements
  • Contribute innovative ideas to enhance data architecture and solve complex customer challenges
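
The Customer ID mapping responsibility above can be illustrated with a toy sketch: records from different source systems are resolved to one unified customer ID. Matching on email alone, and all the class and field names here, are deliberate simplifications; real identity resolution uses multiple match keys and fuzzier rules.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CustomerIdMapper {
    record SourceRecord(String system, String localId, String email) {}

    // Assign one unified customer ID per email address, so records from
    // different systems (CRM, POS, ...) resolve to the same customer.
    static Map<String, String> unify(List<SourceRecord> records) {
        Map<String, String> emailToUnifiedId = new HashMap<>();
        Map<String, String> localToUnified = new HashMap<>();
        int nextId = 1;
        for (SourceRecord r : records) {
            String unified = emailToUnifiedId.get(r.email());
            if (unified == null) {
                unified = "CUST-" + nextId++;
                emailToUnifiedId.put(r.email(), unified);
            }
            // Key each source-local ID ("system:localId") to the unified ID.
            localToUnified.put(r.system() + ":" + r.localId(), unified);
        }
        return localToUnified;
    }

    public static void main(String[] args) {
        var records = List.of(
                new SourceRecord("crm", "c-1", "ada@example.com"),
                new SourceRecord("pos", "p-9", "ada@example.com"),
                new SourceRecord("crm", "c-2", "bob@example.com"));
        System.out.println(unify(records).get("pos:p-9")); // prints: CUST-1
    }
}
```

In AEP itself this unification is handled by Identity Service and identity graphs; the sketch only shows the shape of the mapping problem.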

Required Skills & Experience

  • 10+ years of experience in data transformation and ETL processes on large-scale datasets
  • 5+ years of hands-on experience in data modeling:
    • Relational, Dimensional, Columnar, and Big Data models
  • Strong expertise in SQL and/or NoSQL databases (5+ years)
  • Solid understanding of advanced Data Warehousing concepts
  • Experience with industry-standard ETL tools such as Informatica or Unifi
  • Experience designing customer-centric datasets (CRM, Call Center, Marketing, POS, Offline data, etc.)
  • Familiarity with reporting and visualization tools such as Tableau or Power BI
  • Experience in business requirements gathering, structured analysis, and process design
  • Strong background in professional software development practices
  • Excellent organizational and multitasking abilities across multiple projects
  • Strong communication skills to work with customers, sales teams, and technical stakeholders
  • Self-driven, proactive, and customer-focused mindset

Preferred / Good to Have

  • Experience with Adobe Experience Cloud, especially Adobe Experience Platform (AEP)
  • Knowledge of Digital Analytics and Digital Marketing ecosystems
  • Programming experience in Python, Java, or Bash scripting
  • Hands-on experience with Big Data technologies such as:
    • Hadoop, Spark, Redshift, Snowflake, Hive, Pig
  • Experience working as an enterprise technical consultant or solutions architect

Education

  • Bachelor’s degree in Computer Science, Information Systems, Data Science, or a related field

Competency Area

  • AEP (Adobe Experience Platform)
  • Role: Data Architect

Backend Engineer (Java)

Location: Bengaluru (Hybrid – 2–3 days/week from Varthur office)
Shift: UK Shift (2:00 PM - 11:00 PM IST)


Job Overview

We are looking for a skilled Backend Engineer with strong expertise in Java and cloud-native application development. The ideal candidate will have hands-on experience in designing scalable backend systems, building RESTful services, and working with modern development practices such as Test-Driven Development (TDD) and CI/CD pipelines.


Key Responsibilities

  • Design and develop robust, scalable server-side architecture and technical documentation
  • Collaborate with stakeholders to understand business requirements and translate them into technical solutions
  • Build and maintain RESTful APIs and backend services
  • Write clean, efficient, and maintainable code following best practices
  • Ensure code quality through reviews, testing, and adherence to development standards
  • Debug, troubleshoot, and resolve production and application issues
  • Proactively support end users and address technical concerns
  • Define and implement coding standards and development processes

Required Skills & Experience

  • 6–8 years of experience in backend development
  • Strong proficiency in Java (SE 12+) with hands-on experience in:
    • Generics, Reflection, Multithreading
    • Annotations, Functional Programming, Lambda expressions
    • Java Stream API
    • JPA and JMS
  • Solid experience with Spring Framework:
    • Spring Core, Spring MVC, Spring Boot
  • Experience with ORM tools such as Hibernate
  • Hands-on experience in building and consuming RESTful APIs
  • Strong understanding of database systems, preferably MS SQL, including:
    • Database design
    • Query optimization
  • Experience with build and CI/CD tools:
    • Jenkins, Maven, Gradle
  • Proficiency in version control systems such as Git
  • Understanding of application security concepts:
    • Authentication, Authorization, Data protection
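
To give a flavor of the Stream API and lambda items above, a small self-contained example (assuming a recent JDK; the `Order` record and totals logic are invented for illustration):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StreamDemo {
    record Order(String customer, double amount) {}

    // Group order totals per customer using the Stream API and method references.
    static Map<String, Double> totalsByCustomer(List<Order> orders) {
        return orders.stream()
                .collect(Collectors.groupingBy(
                        Order::customer,
                        Collectors.summingDouble(Order::amount)));
    }

    public static void main(String[] args) {
        List<Order> orders = List.of(
                new Order("acme", 120.0),
                new Order("acme", 80.0),
                new Order("globex", 50.0));
        System.out.println(totalsByCustomer(orders).get("acme")); // prints: 200.0
    }
}
```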

Cloud & Additional Skills (Good to Have)

  • Experience with AWS Services:
    • SQS, Lambda, S3
    • Parameter Store, Secrets Manager
    • ECS
  • Familiarity with API testing and documentation tools such as Postman or Swagger
  • Experience in cloud-native development and microservices architecture

Development Practices

  • Test-Driven Development (TDD)
  • Domain-Driven Design (DDD)
  • Cloud-Native Development

Education

  • Bachelor’s degree in Engineering (BE) or equivalent

VC++ (MFC) Developer

Location: Chennai, India
Employment Type: Full-Time
Positions: 2
Experience: 7–16 Years
Salary: ₹20–40 LPA (as per experience)
Notice Period: 30–60 Days


Job Overview:

We are hiring experienced VC++ (MFC) Developers to work on high-performance, system-level applications supporting advanced hardware systems in a product-based environment. This role involves developing, enhancing, and maintaining software that interfaces with complex devices and contributes to mission-critical systems.


Key Responsibilities:

  • Develop and maintain applications using Microsoft Visual C++ (VC++) and MFC
  • Work on system-level programming in a Win32 environment
  • Design and implement features to interface with hardware devices (COM, USB, PCI, Ethernet)
  • Collaborate with cross-functional teams to define and deliver software requirements
  • Debug, troubleshoot, and optimize multithreaded applications
  • Ensure code quality through reviews and best practices
  • Improve application performance, scalability, and reliability
  • Participate in the full SDLC (design to deployment)

Must-Have Skills (Strict Screening):

  • 7–16 years of experience in software development
  • 5+ years of hands-on C++ (VC++) experience
  • 1+ year of strong MFC experience (mandatory)
  • Solid experience in Win32 API / Windows environment
  • Strong fundamentals in Data Structures & Algorithms
  • Experience in multithreading and system-level programming
  • Must be from a Product-Based Company (mandatory)
  • Good stability (minimum 2 years per organization)
  • Strong debugging and problem-solving skills
  • Willingness to relocate to Chennai
  • Notice Period: 30–60 days only
  • No HCL candidates (strict)

Good-to-Have Skills:

  • Knowledge of STL, SQL, XML, TCP/IP Sockets
  • Experience with hardware interfacing (COM ports, USB, PCI cards)
  • Exposure to Modbus, SECS/GEM protocols
  • Understanding of Windows Kernel-Mode Drivers
  • Experience in industrial automation or semiconductor domain

Education:

  • B.Tech / M.Tech / MS / MSc (Mandatory) in Computer Science or related field

Cloud / Solution Architect (Japanese Language Required)

Location: Any Metro City (Hybrid – 3 Days Work From Office)

Job Overview

We are seeking an experienced Cloud / Solution Architect with strong expertise in designing and implementing cloud-based solutions. The ideal candidate will have deep knowledge of cloud platforms along with proficiency in the Japanese language to effectively collaborate with global stakeholders.

Key Responsibilities

  • Design, develop, and implement scalable and secure cloud architectures on AWS, Azure, or GCP
  • Lead cloud migration and modernization initiatives
  • Collaborate with cross-functional teams to define technical solutions aligned with business goals
  • Ensure best practices in cloud security, networking, and DevOps
  • Provide technical leadership and guidance to development and operations teams
  • Work closely with international (especially Japanese) clients to gather requirements and deliver solutions

Required Skills & Qualifications

  • 12+ years of overall IT experience with a strong focus on cloud technologies (AWS/Azure/GCP)
  • Proven experience in cloud architecture design and migration strategies
  • Solid understanding of networking, security, and DevOps practices
  • Proficiency in Japanese language (mandatory)
  • Japanese Language Certification: JLPT N2 or N1 (required)
  • Experience working with international clients is highly preferred

Additional Information

  • Immediate joiners will be given preference