
Deloitte hiring for Consultant/Senior Consultant-Information Management

Deloitte · Hyderabad, Bengaluru, Mumbai · 2 Years
Backend · BI/Analyst · Data Science/ML
Vivek Verma (Employee)
391 Views · 4 Quick Applied

Job Description

Hi Job Seeker,

Informatica:
 
• Should be well versed with data warehousing concepts
• Should have development experience using Informatica PowerCenter
• Should have worked in at least one RDBMS like Oracle, Teradata, SQL Server, DB2
• Shell scripting in a Linux/Unix environment is mandatory
• Performance improvement skills on Informatica/RDBMS are desirable
• Should have participated in different kinds of testing, like Unit Testing, System Testing and User Acceptance Testing
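Unit and system testing of an ETL load of the kind described above typically includes a source-vs-target reconciliation check. A minimal, hypothetical sketch in Python: `sqlite3` stands in for an RDBMS like Oracle or Teradata, and the table and column names are invented for illustration, not taken from any actual project.

```python
import sqlite3

# In-memory database as a stand-in for the real source and target systems.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])

def reconcile(cur, src, tgt):
    """Compare row counts and summed amounts between source and target tables."""
    src_n, src_sum = cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    tgt_n, tgt_sum = cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return src_n == tgt_n and src_sum == tgt_sum

print(reconcile(cur, "src_orders", "tgt_orders"))  # True when the load is complete
```

Real ETL test plans add column-level checksums and null/duplicate checks on top of this basic count-and-sum comparison.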
 
HFM:
 
• Should have done at least 2 to 3 implementations in Hyperion EPM products
• Knowledge of financial consolidation and reporting
• Strong communication and rapport-building skills with the team; should have experience working in an onshore/offshore delivery model
• Should have strong problem solving and analytical capabilities
• Should be a self-starter in solution implementation with inputs from design documents
 
Datastage:
 
• Should have strong problem solving and analytical capabilities
• Must be well versed with data warehousing concepts, be able to understand ER and dimensional models, and have been part of at least one end-to-end implementation of data warehouse/BI solutions
• Should have development experience using IBM InfoSphere DataStage and have worked in at least one RDBMS like Oracle, Teradata, SQL Server, DB2
• Shell scripting in a Linux/Unix/Windows environment is required
• Performance improvement skills on DataStage/RDBMS are desirable
 
Data Analytics:
 
The key job responsibilities include the following:
The Data Science Engineer will design and develop "Actionable Information Insights" solutions that help businesses make informed decisions. The role involves data modeling and analysis of big data to mine patterns relevant to business processes. The person should be able to acquire data, understand it, visualize it, process it using advanced data mining algorithms, extract value from it, and communicate it effectively.
• Derive insights out of ambiguity: understand, process and interpret complex data
• Analyze complex business data and identify patterns using algorithms from statistics and machine learning
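One simple illustration of identifying patterns with statistical methods is z-score based outlier detection. The sketch below uses only the Python standard library; the metric name and the data are hypothetical, chosen purely to show the technique.

```python
from statistics import mean, stdev

def zscore_outliers(values, threshold=2.0):
    """Flag points that deviate from the mean by more than `threshold` standard deviations."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

daily_sales = [100, 102, 98, 101, 99, 103, 250, 97, 100]  # hypothetical daily metric
print(zscore_outliers(daily_sales))  # flags the 250 spike
```

In practice this kind of check is one of the simplest "pattern in the data" techniques; real engagements layer richer statistical and ML models on top.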
 
 
Big Data Hadoop:
The key skills required:
• 1+ years of hands-on experience using Hadoop (preferably Hadoop 2 with YARN), MapReduce, Pig, Hive, Sqoop, and HBase
• Strong understanding of the Hadoop ecosystem, including setting up a Hadoop cluster, with knowledge of cluster sizing, monitoring, storage design and encryption at rest and in motion
• Experience in scheduling Hadoop jobs using Oozie workflows and Falcon
• Proven experience in implementing security in the Hadoop ecosystem using Kerberos, Sentry, Knox and Ranger
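The MapReduce model mentioned above can be illustrated in miniature: a map phase emits (key, 1) pairs, and a reduce phase sums them after a sort that mimics Hadoop's shuffle stage. This is a toy Python sketch of the concept, not Hadoop code; the input lines are invented.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    # Map: emit (word, 1) for every word, like a Hadoop mapper.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle/sort groups identical keys together, then each group is reduced by summing.
    for key, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield (key, sum(count for _, count in group))

lines = ["Hadoop runs MapReduce", "MapReduce scales"]
print(dict(reduce_phase(map_phase(lines))))
```

On a real cluster the map and reduce functions run in parallel across nodes, and the framework handles the shuffle, partitioning and fault tolerance.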
 
IDQ:
The key skills required:
• Informatica transformation knowledge
• MUST: Expression, Source Qualifier, Router, Filter; File/SQL as both source(s) and target(s)
• OPTIONAL: SQL Transformation, Java Transformation
• Data profiling:
• Standard / best practices
• Column profiling (base column, sub-level column, etc.)
• Cross-join profiling (joined tables profiling, integrity profiling, etc.)
• Create custom profiles
• Profiling summary reports and scorecards
• Data quality standard / best practices
• Create / execute expression rules
• Match / merge processes
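Base column profiling of the kind listed above (row counts, nulls, distinct values, most frequent value) can be sketched in a few lines of Python. This is an illustrative stand-in, not IDQ itself; the column data is hypothetical.

```python
from collections import Counter

def profile_column(values):
    """Base column profile: row count, null count, distinct count, top value."""
    non_null = [v for v in values if v is not None]
    freq = Counter(non_null)
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(freq),
        "top_value": freq.most_common(1)[0][0] if freq else None,
    }

country = ["IN", "US", "IN", None, "UK", "IN"]  # hypothetical column values
print(profile_column(country))
```

A profiling tool runs checks like this across every column at scale and rolls the results up into the summary reports and scorecards mentioned above.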
 
Informatica MDM:
  The key skills required
•  Good knowledge of Informatica MDM.
•  Good knowledge of IDD, HM, Match/Merge and SIF.
•  Should have experience in integrating data quality tools with Informatica MDM.
•  Should have experience in integrating Informatica MDM with downstream and upstream applications through a batch/real-time interface.
•  Should have experience working in at least three end-to-end implementations of Informatica MDM solutions.
•  Should have experience in fine tuning match/merge process and troubleshooting performance issues in Informatica MDM.
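Match/merge processing typically begins with fuzzy matching of candidate records. Below is a hedged Python sketch using `difflib`'s similarity ratio, with invented record names and an arbitrary threshold; real MDM match rules are far richer (token-, phonetic- and attribute-weighted).

```python
from difflib import SequenceMatcher

def match_score(a, b):
    """Similarity score between two name strings, 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def merge_candidates(records, threshold=0.85):
    """Pair up records whose names are similar enough to review as potential duplicates."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if match_score(records[i], records[j]) >= threshold:
                pairs.append((records[i], records[j]))
    return pairs

names = ["Acme Corporation", "ACME Corp.", "Globex Ltd", "Acme Corporatian"]
print(merge_candidates(names))  # the typo variant pairs with the original
```

Fine-tuning a match process in practice means adjusting such thresholds and rules to balance false merges against missed duplicates.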
 
IBM MDM:
  The key skills required
•  Proficient in customization and configuration of IBM MDM Infosphere Server.
•  Should be able to do complex design and configurations
•  Should be able to work independently and lead a small-medium size team
•  Should be proficient in Java/J2EE/Web services and be able to integrate InfoSphere MDM with other applications in the landscape
•  Technical design and implementation of IBM InfoSphere engagements
•  Provide standards, guidelines, processes and expertise to consistently address enterprise MDM issues such as convergence, standards and synchronization
 
Emerging Technologies: Reltio, Collibra, Talend, Tam
  
 
Talend:
The key skills required:
• 3-7 years of technology consulting experience
• A minimum of 1 year of experience in designing and developing jobs in Talend Data Integration (DI) and Data Quality (DQ)
• A minimum of 3 years of experience in designing and developing data integration jobs using other tools such as Informatica, DataStage, Ab Initio, Oracle Data Integrator, etc.
• Strong foundation in data warehouse concepts, relational databases and data modeling in an RDBMS
• Generate native code using Talend Data Quality (DQ)
• Ability to translate business and technical requirements into technical design
• Good knowledge of end-to-end project delivery methodology for implementing ETL
• Strong UNIX operating system concepts and shell scripting knowledge
• Ability to operate independently with a clear focus on schedule and outcomes
 
BODS:
• Total experience 4-8 years, with over 2 years of experience in the SAP BODS suite and its components
• Worked on at least one end-to-end implementation as part of a PL/SQL or SAP data conversion/data quality project, or a data warehouse implementation project
• Good hands-on experience in the Data Quality components of BODS, including address cleansing and duplicate matching functionalities
• Functional and technical understanding of SAP transaction systems (SAP ECC, SAP CRM, etc.) and SAP BW will be an added advantage
• Good overall understanding of Enterprise Data Management components such as Master Data Management, Data Governance, and so on will be an added advantage
 
 
Power BI:
  The key skills required
•  Hands-on professional with thorough knowledge of scripting, data source integration and advanced GUI development in Power BI
•  Experience in creating Power BI reports that represent different findings and insights from the data using interactive charts and maps
•  Prior experience of working with Custom visualizations
•  Familiarity with custom formatting
•  Using Power BI filters and conditions at dataset, report and dashboard levels
•  Creation of advanced visualizations such as heat maps
•  Creating aesthetically appealing dashboards using Power BI reports
•  Connect to multiple datasets to bring all of the relevant data together in one place and create reports that provide a consolidated view of the different sources
 
Mulesoft:
 
• At least 2+ years of experience as Integration Developer using any ESB
• 1+ years of experience in MuleSoft
• Experience using Anypoint Studio.
• Experience on MuleSoft versions 3.5 and above.
• Good Understanding of Database concepts.
• Experience working on MuleSoft for services and micro-services
• Experience working on MuleSoft for Orchestration and Integration
• Experience in defining business process and data models (UML, BPMN).
• Experience in working on flows, sub-flows, connectors, flow controls, filters, REST/SOAP web services, API Design and Development using RAML, MUnits/JUnits, Data Transformation using Weave & Mapper and Exception Handling Strategies.
•  Experience as an object-oriented software engineer.
•  Extensive Experience in a Scrum/Agile SDLC Environment.
•  Experience with multiple commercial integration systems a plus (Tibco, Biztalk, IBM)
 
Azure:
• Experience in defining architecture for large solutions using .NET/Java technologies
• Experience in defining and implementing large cloud/Azure-based solutions
• Working experience in Azure IaaS, PaaS, storage, network and database
• Experience with and understanding of security requirements for the cloud
• Experience in defining highly available, DR solutions in Azure
• Experience in migrating large on-premises workloads (Windows and non-Windows) to Azure
 
Tableau:
The key skills required
 
• Develop advanced descriptive and predictive analytics visualization dashboards
• Analyzing business specifications and creating diagnostic, descriptive reports
• Designing dashboards and scorecards, taking into account aspects including horizontal/vertical cascading, drill-downs/drill-throughs, etc.
• Enabling self-service capabilities through platform as a service (PaaS) for customized reports and ad-hoc report creation
• Developing visualization products that present information easily understood by all users, using QlikView, Tableau, or Spotfire
• Optimizing presentation and visualization of results
 
Pentaho:
The key skills required
 
• Should have expertise in ETL tools like the Pentaho Spoon IDE and the Pentaho BI suite of products, and in at least one RDBMS such as PostgreSQL, Oracle, Teradata, SQL Server or DB2
• Should have 3+ years of experience in Pentaho ETL, with at least 3 to 4 implementations on Pentaho ETL
• Should have experience in Structured Query Language (SQL) and procedural SQL
• Excellent analytical, problem solving and troubleshooting skills to manage complex process and technology issues
• Should have considerable experience in testing activities, such as creating system/integration/performance test plans, performing system/integration/performance testing, defect tracking, root cause analysis of defects, supporting user acceptance testing (UAT) and facilitating UAT sign-off
 
   
Information Analyzer:
 
Should have strong problem solving and analytical capabilities
Should be well-versed with Data Quality Analysis techniques and concepts
Should have development experience using the IBM InfoSphere Information Analyzer Data Quality and DataStage console
Should have worked in at least one RDBMS like Oracle, Teradata, SQL Server, DB2
Shell scripting in a Linux/Unix environment is mandatory
Performance improvement skills on IBM Information Analyzer are desirable
Should have participated in different kinds of testing, like Unit Testing, System Testing, User Acceptance Testing
Should have participated in at least one end-to-end implementation of data quality solutions
Should be able to understand both ER and dimensional models
Should be well versed with design documents like HLD, LLD, etc.
Should be a self-starter in solution implementation with inputs from design documents
Preferable to have experience with automation and scheduling tools
Candidate should possess good communication skills, and should have experience working in an onshore/offshore delivery model
 
Informatica BDE:
Minimum 3 to 4 years of hands-on experience with ETL tools, preferably Informatica. Good to have: at least 6 months to 1 year of experience developing mappings using Informatica BDE. Good knowledge of big data technologies, especially HDFS, MapReduce and Hive/HiveQL, preferably with working experience. Strong SQL/Unix experience.
Strong logical, analytical and communication skills. Should be self-motivated and a quick learner. As an Informatica BDE developer, you will be expected to develop and unit test simple to complex Informatica BDE mappings as per the technical design specifications provided to you.
 
 
Qualifications
Location: Hyderabad, Bangalore, Mumbai & Gurgaon
Good communication skills
Overall 4+ years of relevant experience
Education: B.E./B.Tech./MCA/MSc/MTech/BSc/MBA

 
 
 
Interested candidates can send their resume to my email ID: Click to see email-id
NOTE: If you have appeared for the interview process in the past 6 months, or have already applied through some other source(s), kindly DO NOT send your resume.


