Careers
Job Title: Software Engineer
Job ID: 11178
Job Type: Permanent / Full Time
Job Description:
Job duties:
- Design, develop, and operate high-scale applications with a focus on operational excellence, security, and scalability.
- Participate in all software development life cycle (SDLC) activities of the project, including gathering requirements from the product's business owners, analyzing those requirements, providing design solutions, development, testing, and supporting operations.
- Create technical specifications and document application functionality as a reference for future maintenance and upgrades.
- Develop user interfaces with clear, attractive designs and intuitive navigation using HTML, JavaScript, and Angular.
- Develop and implement microservices using Java, Spring Boot, etc.
- Develop gRPC web services using Node.js.
- Use SQL to insert, retrieve, and update information in the MySQL database, and develop stored procedures and triggers using PL/SQL.
- Write test cases for applications using JUnit and Chai, and test client-side apps with Karma, Jasmine, and related tools.
- Develop, test, and deploy updates and bug fixes using CI/CD pipelines; use Git-based platforms such as GitHub, GitLab, and Bitbucket for code hosting and collaboration.
- Participate in a tight-knit engineering team employing agile software development practices.
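The testing duty above names JUnit, Chai, Karma, and Jasmine; as a language-neutral illustration of the same idea, here is a minimal unit test sketched with Python's built-in unittest (the function and test names are hypothetical, not from this posting):

```python
import unittest

# Hypothetical function under test: the kind of small, deterministic
# unit a JUnit/Chai test would target.
def normalize_username(raw: str) -> str:
    """Trim surrounding whitespace and lowercase a username."""
    return raw.strip().lower()

class NormalizeUsernameTest(unittest.TestCase):
    def test_trims_and_lowercases(self):
        self.assertEqual(normalize_username("  Alice "), "alice")

# Run the suite programmatically so the script exits normally.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeUsernameTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same arrange/act/assert structure carries over directly to JUnit on the server side and Jasmine/Karma on the client side.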
Job Title: Data Engineer
Job Type: Permanent / Full Time
Available positions: 6
Job Description:
- As a Senior Data Engineer, you'll play a key role in building and shaping the strategy of our finance analytics engineering team within the Enterprise Data Engineering group.
- Our technology stack includes Airflow, dbt, Python, Snowflake, AWS, GCP, Amplitude, Fivetran, and more.
Roles and Responsibilities:
- Building and continuously improving our data gathering, modeling, and reporting capabilities, as well as our self-service data platforms.
- Working closely with Data Engineers, Data Analysts, Data Scientists, Product Owners, and Domain Experts to identify data needs.
Qualifications Required:
- Relevant bachelor's degree, preferably in CS, Engineering, or Information Systems, or an equivalent software engineering background.
- 8+ years of experience as a Data/BI engineer.
- Strong SQL abilities and hands-on experience with SQL and NoSQL databases, performing analysis and performance optimizations.
- Hands-on experience in Python or an equivalent programming language.
- Experience with data warehouse solutions (e.g., BigQuery, Redshift, Snowflake).
- Experience with data modeling, data catalog concepts, data formats, and data pipeline/ETL design, implementation, and maintenance.
- Experience with AWS/GCP cloud services such as GCS/S3, Lambda/Cloud Functions, EMR/Dataproc, Glue/Dataflow, and Athena.
- Experience with Airflow and dbt – an advantage.
- Experience with data visualization tools and infrastructures (e.g., Tableau, Sisense, Looker) – an advantage.
- Experience with development practices (Agile, CI/CD, TDD) – an advantage.
- Experience with Infrastructure as Code practices (Terraform) – an advantage.
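As a rough illustration of the pipeline/ETL design mentioned in these qualifications, here is a toy extract-transform-load step in plain Python (all names and data are invented; a real pipeline would run under Airflow/dbt against a warehouse such as Snowflake or BigQuery):

```python
# "Extract": pretend these rows came from an API or a landed file.
raw_rows = [
    {"id": "1", "amount": "19.90", "currency": "usd"},
    {"id": "2", "amount": "5.00",  "currency": "EUR"},
]

def transform(row):
    # Cast types and normalize values so downstream models see a consistent schema.
    return {
        "id": int(row["id"]),
        "amount": float(row["amount"]),
        "currency": row["currency"].upper(),
    }

# "Load": a dict stands in for a warehouse table keyed by primary key.
warehouse = {}
for row in raw_rows:
    clean = transform(row)
    warehouse[clean["id"]] = clean  # idempotent upsert by id
```

Keying the load step by primary key makes reruns idempotent, which is the property real orchestrated pipelines rely on for safe retries.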
Job Title: Snowflake Developer
Job Type: Permanent / Full Time
Available positions: 15
Qualifications Required:
- Minimum 7 years of experience with Snowflake design patterns and migration architectures.
- Provide thought leadership, strategy, and direction for the Snowflake practice.
- Hands-on experience with on-prem-to-Snowflake migration in at least one project.
- Understanding of Snowflake roles, user security, and account administration.
- Requirement gathering and analysis.
- Write SQL queries against Snowflake and develop scripts for extract, load, and transform (ELT) processing.
- Programming expertise in languages such as shell scripting and Python for ETL/ELT.
- Understanding of Snowflake capabilities such as Snowpipe, Streams, and Tasks.
- Design and implement cloud data platforms and cloud-related architectures.
- Design solutions leveraging Snowflake-native capabilities.
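The ELT scripting described above typically ends with merging a batch of changes into a target table. Snowflake itself can't run here, so this sketch shows the same upsert pattern with SQLite's ON CONFLICT clause (table and column names are hypothetical; Snowflake would use a MERGE statement):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")

# New batch: one update (id 1) and one insert (id 2).
batch = [(1, "Acme Corp"), (2, "Globex")]
conn.executemany(
    """
    INSERT INTO dim_customer (id, name) VALUES (?, ?)
    ON CONFLICT(id) DO UPDATE SET name = excluded.name
    """,
    batch,
)

rows = conn.execute("SELECT id, name FROM dim_customer ORDER BY id").fetchall()
```

After the batch, `rows` holds the updated id 1 and the newly inserted id 2, which is exactly the behavior a Snowflake MERGE delivers at warehouse scale.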
Job Title: Databricks Developer
Job Type: Permanent / Full Time
Available positions: 17
Job Description:
Roles & Responsibilities :
- Work with the data engineering team to define and develop data ingestion, validation, transformation, and data engineering code.
- Develop open-source platform components using Databricks, Hadoop, Spark, Scala, Java, Oozie, Hive and other components.
- Deliver on cloud platforms and integrate with services such as Azure Data Factory, ADLS, Azure DevOps, Azure Functions, Synapse or AWS Glue, Redshift, Lambda, and S3.
- Document code artifacts and participate in developing user documentation and run books.
- Troubleshoot deployment to various environments and provide test support.
- Participate in design sessions, demos, and prototype sessions, testing and training workshops with business users and other IT associates.
Qualifications Required:
- At least 2 years of experience developing large-scale data processing/data storage/data distribution systems, preferably with Databricks.
- At least 3 years of experience on large Hadoop projects using Spark and Python, working with the Spark DataFrame and Dataset APIs, Spark SQL, RDDs, and Scala function literals and closures.
- Hands-on experience with Hadoop, Hive, Sqoop, Oozie, and HDFS; strong SQL skills.
- Experience with ELT/ETL development, patterns and tooling is recommended.
- Experience with AWS and/or Azure cloud environments.
- Experience with RDBMS platforms such as Postgres and MySQL.
- Experience with Linux environments (RHEL or CentOS preferred).
- Experience with various IDEs and code repositories, as well as unit testing frameworks.
- Experience with code build tools such as Maven.
- Fundamental knowledge of distributed data processing systems and storage mechanisms.
- Ability to produce high-quality work products under pressure and within deadlines.
- Strong communication and collaborative skills.
- At least 5 years of experience working in large multi-vendor environments with multiple teams as part of a project.
- At least 5 years of experience working with complex Big Data environments.
- 5+ years of experience with JIRA, GitHub/Git, and other code management toolsets.
Preferred Skills and Education:
- Bachelor's degree in Computer Science or a related field.
- Certification in Spark, Databricks, AWS, Azure, or another cloud platform.
Job Title: Staff Software Engineer (Data Modeling/SQL + Visualization + Analytics)
Job Type: Permanent / Full Time
Skills and qualifications:
- Bachelor’s degree in Computer Science, Mathematics, Statistics, or related field
- Experience in leading an engineering team
- 8+ years of hands-on Data Modeling/SQL development experience
- Advanced SQL expertise (query creation, windowing, and tuning)
- Experience with big data analytics or real-time analytics solutions
- Experience with relational databases (e.g., Oracle, MySQL) and/or NoSQL databases (e.g., HBase, MongoDB)
- Experience with Agile (e.g., Scrum) and test-driven development
- Expertise in Data Structures, Algorithms and Concurrency
- Experience with Looker’s LookML (modeling language)
- Experience building Microservices and APIs
- Experience with Amazon Web Services (EC2, S3), Google Cloud, or Azure
- Experience with Cloudera Impala, Hive, Hibernate
- Hands-on experience with Hadoop, Spark, Kafka, ElasticSearch
- Experience with Continuous Integration (CI) and Continuous Delivery (CD)
- Contributions to an open-source community
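The "Advanced SQL expertise (query creation, windowing, and tuning)" item above can be made concrete with a small runnable example. This uses SQLite for self-containment, but the RANK() OVER syntax is the same in Oracle, MySQL 8+, and most modern databases (the table and data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (rep TEXT, region TEXT, amount INT)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("ann", "east", 100), ("bob", "east", 300),
    ("cat", "west", 200), ("dan", "west", 150),
])

# Window function: rank reps within each region by amount, descending,
# without collapsing rows the way GROUP BY would.
rows = conn.execute("""
    SELECT rep, region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()
```

The PARTITION BY clause restarts the ranking per region, so `bob` and `cat` each come out ranked first in their own region.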
Job Title: Sr. Cloud Network Security Architect
Job Type: Permanent / Full Time
- AWS-specific skills (Networking: VPC, Virtual Gateway, Route 53, Direct Connect, Transit VPC, Transit Gateway, Lambda, endpoints, load balancers) and (Security: WAF, Config, CloudWatch, etc.)
- Azure-specific skills (Networking: VNet, VNet peering, UDR, SDR, ExpressRoute, NSG, load balancers, endpoints)
- Experience with automated configuration and deployment: Terraform or other Infrastructure as Code (IaC) frameworks
- Experience with distributed version-control systems: Git/GitHub
- 5+ years of strong enterprise networking experience with routing/switching configuration and diagnostics in global network infrastructure design and delivery of WAN, LAN, firewall, and F5.
- Experience with Cisco hardware and OS: Catalyst switches, ISR/ASR routers, ASA
- Strong practical experience with Palo Alto firewalls is highly desired
- Strong understanding of the following network protocols: BGP, IPsec, and IPsec VTI VPN
- Experience and in-depth understanding of TCP/IP packets with ability to analyze captured packets for deep troubleshooting.
- Scripting (Python, Ansible, Tower) experience is a plus
- Work closely with the network architecture, security, and application teams to roll out new designs and perform activities supporting cloud application migration projects.
- Leverage prior experience with Azure and AWS to implement global connectivity solutions.
- Implement automated processes for the cloud network environment, eliminating manual and repetitive tasks.
- Create and maintain Infrastructure as Code (IaC) using industry-standard platforms.
- Implement industry-standard cloud network security practices during build activities and maintain them throughout the lifecycle.
- Perform functional testing to verify implementations meet production acceptance standards.
- Provide support for cloud network services on complex issues.
Job Title: Senior Manager, DevOps & QA
Job Type: Permanent / Full Time
At Kivyo, the Senior Manager of DevOps and QA will lead the strategic vision, execution, and business-unit adoption of, and adherence to, Development Operations and Quality Assurance standards and practices in Prescription Automation.
The Senior Manager will own the entirety of the release management lifecycle inclusive of software planning, testing, IT Security, software updates, hardware upgrades, strategic projects and programs. You will be responsible for architecting, implementing, and managing quality and release processes for code through development, test, and production environments.
Role and Responsibilities
- Lead the DevOps strategy, in support of frequent, iterative builds for multiple development teams into test environments. The focus of the team is to automate manual processes and improve efficiency across the development lifecycle.
- Prioritize DevOps and QA Team initiatives and projects to optimize efficiency and quality
- Collaborate with multiple development teams to identify needs for building and deploying their applications into test environments
- Facilitate, manage and communicate information related to server environments – their current state, deployed versions and availability to developers, QA engineers and team managers
- Identify and deploy infrastructure for testing environments.
- Implement DevOps, Application Lifecycle Management, and Agile solutions
- Contribute to implementation and adherence to continuous integration practices, including build automation, test automation, fast builds and clean build maintenance.
- Enhance and integrate the portfolio management software products into a CI/CD process using supportable tools.
- Own and execute DevOps software configuration and release activities in central Git repositories and CI/CD – branching, tagging, building, releasing
- Ensure software code security compliance
- Measure and monitor release progress to ensure application releases are delivered on time and within budget, and that they meet or exceed expectations
- Conduct Release Readiness reviews, Milestone Reviews, and Business Go/No-Go reviews
- Lead and coordinate Go-Live activities, including execution of the test and deployment plans and checklists.
- Maintain a release repository and manage key information such as build and release procedures, dependencies, and notification lists
- Develop scripted manual tests to deterministically assure that the software system is performing as specified by the requirements
- Develop and execute test cases, scripts, plans, and procedures (manual and automated)
- Perform manual testing, reporting bugs in the issues tracking system when necessary.
- Oversee the creation of automated test scripts
- Maintain virtual environments to replicate customer deployments for testing
- Collaborate cross functionally with software engineers, business analysts, product managers, implementation specialists, and project managers to provide testing assistance.
- Utilize proven software quality assurance methodologies and techniques to ensure the highest quality software
- Improve software quality by appropriately challenging the status quo
- Maintain and expand automated test infrastructures for Microsoft .Net environment using Azure DevOps
- Oversee quality assurance including establishing metrics, applying industry best practices, and developing new processes to ensure quality goals are met.
- Provide test harnesses and participate in FAT (Factory Acceptance testing) where appropriate.
Qualifications:
- Bachelor’s degree in Computer Science or related field
- 5 or more years of management experience
- 2-6 years of previous release and/or project management experience
- 8-10 years of experience in information systems operations environment in systems analysis or development
- 2+ years of hands-on pharmacy automation experience
- Advanced knowledge of software development lifecycle
- DevOps operations and tools experience. Experience with Azure DevOps is preferred
- Demonstrated ability to coordinate cross-functional work teams toward task completion
- Demonstrated effective leadership and analytical skills
- Advanced written and verbal communication skills are a must
Job Title: Cloud Operations Engineer
Job Type: Permanent / Full Time
Job Description
Manage the cloud infrastructure environment through cross-technology administration (OS, databases, virtual networks) and execution of scripting and monitoring automations. Manage environment incidents with a focus on service restoration. Act as operations support for all compute, network, storage, security, or automation incidents/requests.
Key Responsibilities:
- High-level design documentation for Amazon and Azure cloud connectivity technologies: VPC, Direct Connect
- Intra- and inter-cloud routing plans: BGP peering, CIDR blocks, security groups, network access control lists
- Modify firewall policies; Transit Gateway (TGW)/vWAN, cloud ExpressRoute/Direct Connect
Technical Experience:
- Experience working with AWS services: VPC, VPN, EC2, RDS, ElastiCache, CloudFront, load balancers, Auto Scaling, Backup, Lambda
- Experience working with Azure services: Virtual Machines, Blob Storage (Block, Cool, and Archive tiers), ASR, StorSimple, Redis Cache, SQL Data Warehouse, Load Balancer, Application Gateway, CDN, Azure DNS, ExpressRoute, Virtual Network, Log Analytics, etc.
Professional Attributes:
- Must be able to handle day-to-day cloud and security operations efficiently with a team of 15 resources
- Security Operations Center/NOC support for the client environment
- Excellent communication skills
Educational Qualification: BE/B.Tech in Computer Science or equivalent IT experience; should have 5 years of experience.
Added Advantage: Certifications: AWS/Azure