Big Data Tester

Job Description:

  • Big Data testing on Hadoop, Hive, etc., using HiveQL, MapReduce, or other big data technologies for automated validation
  • Implementing DevOps / CI-CD using Bitbucket, Jenkins, etc.
  • Implementing test automation frameworks for back-end validation, covering RDBMS, file, and Kafka sources and Big Data targets
  • Working in an Agile environment with Jira, ALM, Confluence, etc.
  • Big Data technologies (Hadoop, Hive, etc.) using HiveQL, Java MapReduce / Pig Latin / Spark (Python), or other big data technologies on Linux
  • Experience with SQL and shell scripting

Desired Candidate Profile:

  • Ability to solve problems and identify opportunities through analytical thinking.
  • Good knowledge of Advanced MS Excel tools is a must for reporting and analysis.
  • Strong attention to detail and the ability to learn quickly.
  • Excellent communication skills with good command over English language (verbal & written).
  • Ability to work independently and in a team environment.

Test Delivery Manager

Responsibilities:

  • Providing strong coverage for Data Validation projects (ETL, Big Data, Analytics, AI, Reporting) for global customers.
  • Managing a team of 30+ in Data Validation projects
  • Demonstrating improvements in CSAT, Delivery Excellence, and Operations KPIs
  • Providing automation expertise.
  • Demonstrating high energy, integrity, and a track record of exceeding performance expectations in generating revenue and managing client relationships.

ETL Developer

Job Description:

  • ETL / Ab Initio skills: development experience in GDE, Express-It, or any other ETL tool
  • Exposure to data analysis
  • Experience in requirements analysis, data analysis, application design, application development, implementation, and testing of data warehousing business systems.
  • Solid experience developing Ab Initio applications for extraction, transformation, cleansing, and loading into a data warehouse / data mart.
  • Strong experience developing reusable, generic applications for high-volume DWH environments.
  • Experience in the development and maintenance of schemas, fact tables, dimensional tables (Type 1, Type 2, Type 3), and surrogate keys.
  • Extensive use of the Ab Initio EME data store/sandbox (or an equivalent facility in another ETL tool) for version control, code promotion, and impact analysis
  • Experience providing Business Intelligence solutions in data warehousing on Windows and UNIX platforms.
  • Experience writing UNIX shell scripts.
  • Experience integrating various data sources with multiple relational databases such as Oracle and SQL Server, including integrating data from flat files and spreadsheets.
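The Type 1 / Type 2 / Type 3 dimension handling mentioned above can be illustrated with a small sketch. This is a minimal Type 2 slowly changing dimension update in Python using the standard-library sqlite3 module as a stand-in for a real warehouse; the table and column names (dim_customer, customer_id, city, etc.) are illustrative assumptions, not taken from any specific project.

```python
# Minimal Type 2 SCD sketch: history is preserved by expiring the old
# row and inserting a new row with a fresh surrogate key (sk).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Surrogate key is auto-generated; customer_id is the natural key.
cur.execute("""
    CREATE TABLE dim_customer (
        sk          INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,
        is_current  INTEGER
    )
""")

def scd2_upsert(cur, customer_id, city, load_date):
    """If the tracked attribute changed, close the current row and
    insert a new current row (Type 2: keep full history)."""
    cur.execute(
        "SELECT sk, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    )
    row = cur.fetchone()
    if row and row[1] == city:
        return  # no change; nothing to do
    if row:
        # Expire the old version instead of overwriting it (that would be Type 1)
        cur.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 WHERE sk = ?",
            (load_date, row[0]),
        )
    cur.execute(
        "INSERT INTO dim_customer "
        "(customer_id, city, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, load_date),
    )

scd2_upsert(cur, 101, "Pune", "2024-01-01")
scd2_upsert(cur, 101, "Mumbai", "2024-06-01")  # city change -> new version
```

A Type 1 dimension would simply overwrite the city in place, and a Type 3 dimension would keep a single "previous city" column; Type 2 is the only variant that retains the full change history.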

Desired Candidate Profile:

  • Ability to solve problems and identify opportunities through analytical thinking.
  • Good knowledge of Advanced MS Excel tools is a must for reporting and analysis.
  • Strong attention to detail and the ability to learn quickly.
  • Excellent communication skills with good command over English language (verbal & written).
  • Ability to work independently and in a team environment.

ETL Tester / Validation

Skills Required:

  • Hands-on knowledge of complex SQL queries.
  • Hands-on knowledge of Unix commands.
  • Sound knowledge of database concepts.
  • Hands-on knowledge of Ab Initio.
  • Hands-on knowledge of HP ALM.
  • Sound knowledge of STLC concepts.
  • Sound knowledge of ETL testing concepts.
  • Sound knowledge of Agile methodology.
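The "complex SQL" in ETL testing is typically source-to-target reconciliation. A minimal sketch of three common validation checks (row-count match, column checksum, and missing-row detection), using an in-memory sqlite3 database in place of real source and target systems; the table names (src_orders, tgt_orders) are illustrative assumptions.

```python
# ETL validation sketch: compare a source table against its loaded target.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

# Check 1: row counts must match between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# Check 2: a simple numeric checksum (sum of a measure column);
# real projects often hash entire rows instead.
src_sum = cur.execute("SELECT SUM(amount) FROM src_orders").fetchone()[0]
tgt_sum = cur.execute("SELECT SUM(amount) FROM tgt_orders").fetchone()[0]

# Check 3: rows present in source but missing from target.
missing = cur.execute("""
    SELECT order_id FROM src_orders
    EXCEPT
    SELECT order_id FROM tgt_orders
""").fetchall()
```

The same three queries translate directly to HiveQL or any RDBMS dialect; only the connection setup changes.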


Big Data Developer/ Sr Developer

Role: Big Data Lead
We are looking for an Engineering and Solution Lead to contribute technically to the design and
development of an advanced analytics platform leveraging Big Data technologies.

Responsibilities:
• Lead a development team of data engineers
• Implement a big data enterprise data lake, BI, and analytical system using Hive LLAP, Spark, Kafka, and HBase
• Responsible for design, development, testing oversight, and implementation
• Work closely with the program manager, scrum master, and architects to convey technical impacts to the development timeline and risks
• Coordinate with data engineers and API developers to drive program delivery
• Drive technical development and application standards across the enterprise data lake and Retail Data domain
• Benchmark and debug critical issues with algorithms and software as they arise
• Lead and assist with the technical design and implementation of the Big Data cluster in various environments
• Guide and mentor the development team, for example to create custom common utilities/libraries that can be reused across multiple big data development efforts
• Accomplish results by communicating expectations; planning, monitoring, and appraising results; coaching, counselling, and disciplining employees; and developing, coordinating, and enforcing systems, policies, procedures, and productivity standards

Requirements:
• Must have a bachelor's degree in Computer Science, Engineering, or a related field
• 10+ years in the software engineering profession
• 4+ years of hands-on experience in the design and development of large-scale Big Data applications on platforms such as Hortonworks and Cloudera
• Extensive experience with Java, Spark, Hive, Kafka, Scala, and Python
• Must have experience with design, development, and deployment in a DevOps environment
• Should have led the design and implementation of an end-to-end data warehousing solution, including source-system integration and data lake implementation, in a Big Data environment
• Must be willing to work in a fast-paced environment with onshore-offshore distributed Agile teams
• Must have the ability to develop and maintain strong collaborative relationships at all levels across IT and business stakeholders
• Must have the ability to prioritize multiple tasks and deal with urgent requests in a fast-paced, demanding environment
• Excellent written and oral communication skills; adept at presenting complex topics, influencing, and executing with timely, actionable follow-through
• Good understanding of FSLDM or equivalent financial/banking data warehouse models

Desired Characteristics:
• Ability to provide innovative ideas and see them through to implementation
• Extensive experience working with data warehouses and big data platforms
• Demonstrated experience building strong relationships with senior leaders
• Outstanding written and verbal skills and the ability to influence and motivate teams
• Banking domain knowledge is preferable
• Experience in a customer-facing role in earlier engagements
