Wednesday, June 27, 2018

CGI hiring for Software Engineer (Hadoop) in Hyderabad

Company : CGI

Website :

Job Role : Software Engineer - Hadoop

Eligibility : Any Graduate

Experience : Freshers/Experienced

Job Location : Hyderabad

Company Profile:

Founded in 1976, CGI is a global IT and business process services provider delivering high-quality business consulting, systems integration and outsourcing services. With 68,000 professionals in 40 countries, CGI has an industry-leading track record of on-time, on-budget projects, aligning our teams with clients’ business strategies to achieve top-to-bottom line results. Our client proximity operating model fosters local accountability for client satisfaction and success. With global delivery capabilities in centers located across four continents, we offer our clients the best value, advantages of scale and operational efficiencies, as well as the ability to reduce their time to market.

Job Description:

First level Applications Development professional, representing the most common entry point into the organization.
Performs routine activities related to applications development.
Focuses on learning and acquiring work skills/knowledge in the Applications Development field.

Job Responsibilities:

Understanding of database internals and data warehouse implementation techniques; working knowledge of SQL.
Solid understanding of data structures and common algorithms, including the time complexity of algorithms.
Implementation and design of distributed systems.
Programming and language skills.
Understanding and experience with UNIX and Shell scripting.
System design and implementation experience.
Familiar with fault tolerance system design and high performance engineering.
Fundamental concepts of scheduling, synchronization, IPC and memory management.
Familiarity with information retrieval techniques is preferred.
Able to work well in extremely fast-paced, entrepreneurial environments.
Independent self-starter who is highly focused and results-oriented.
Strong analytical and problem solving skills.
Design and implement workflow jobs using Talend and UNIX/Linux scripting to perform ETL on the Hadoop platform.
Translate complex functional and technical requirements into detailed design.
Perform analysis of data sets and uncover insights.
Maintain security and data privacy.
Propose best practices/standards.
Work with the System Analyst and development manager on a day-to-day basis.
Work with other team members to accomplish key development tasks.
Work with service delivery (support) team on transition and stabilization.


Skills : Hadoop, Hive.
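As a rough illustration of the kind of shell-scripted ETL wrapper this role involves (the table names, paths and schedule here are hypothetical, not part of the job description):

```shell
#!/bin/sh
# Sketch of a daily ETL wrapper for a Hive table on Hadoop.
# All table/path names are hypothetical examples.
set -eu

# Date of the run; defaults to today if not passed as the first argument.
RUN_DATE="${1:-$(date +%Y-%m-%d)}"
RAW_DIR="/data/raw/logs/dt=${RUN_DATE}"

# Build the HiveQL for this run. In production this string would be
# submitted via `hive -e "$HQL"` or beeline; here it is only emitted
# for inspection so the script stays self-contained.
HQL="
LOAD DATA INPATH '${RAW_DIR}'
INTO TABLE logs PARTITION (dt='${RUN_DATE}');

INSERT OVERWRITE TABLE daily_counts PARTITION (dt='${RUN_DATE}')
SELECT user_id, COUNT(*) AS events
FROM logs
WHERE dt = '${RUN_DATE}'
GROUP BY user_id;
"

echo "$HQL"
```

Parameterizing the partition date this way lets the same script be re-run for back-fills simply by passing an older date.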

How to Apply : Apply Here.
