Role: Big Data Architect / Senior Architect
Work Location: Bangalore
Experience: 13+ years
SBU: Business Information Management (BIM)
Responsibilities

Capgemini’s Business Information Management (BIM) initiative is a strategic focus for the company globally. BIM is a multi-disciplinary group with a focus on four key aspects - Data, Analytics, Application and Cloud - and Big Data is at the center of the strategy.
The Big Data Architect will play a key role in the following areas of the information fabric, as it relates to the development of next-generation business technology assets.
- Drive client engagements focused on big data and advanced analytics, in diverse domains such as telco customer experience, retail price optimization, channel management, marketing strategy and smart grids
- Lead architecture and solutioning efforts to define and develop big data components that interoperate seamlessly with other elements of the broader information architecture
- Clearly understand and articulate the detailed differences between various Big Data technologies (Hadoop, NoSQL, etc.) and products (e.g., Cloudera, Hortonworks, IBM BigInsights, Pivotal)
- Analyse data growth and lead capacity/sizing activities to arrive at the most appropriate commercial and technical solution
- Leverage data visualization techniques and tools to effectively demonstrate patterns, outliers and exceptional conditions in the data
- Work collaboratively with other members of the data science and information architecture teams to innovate and create compelling data-based stories and experiences
- Develop a flair for the exploratory and experimental side of the role, which requires surfacing interesting and previously unknown insights from vast pools of data
- Work closely with the extended BIM team (Data, Analytics, Cloud) to support various projects by developing strong engineering and custom application modules
- Assume ownership of deliverables and ensure these are completed within set deadlines
- Understand and articulate the role of various intersecting technologies like Cloud, Big Data, Social and Analytics in the enterprise

Specific Competencies for the Role

Technical - Required Skills
- Graduate / Post Graduate from premier engineering colleges such as IITs, BITS, NITs, etc.
- 12+ years of experience in the design, development and deployment of Information Management / Business Intelligence applications
- 2+ years of experience working with the Hadoop framework, including components such as HDFS, Hive and the MapReduce programming framework
- At least one year of hands-on experience with a commercial big data product/distribution such as Cloudera, Hortonworks, IBM InfoSphere BigInsights, Pivotal, CouchDB or MongoDB
- Strong understanding of database technologies, and prior experience working with at least one MPP (e.g., Greenplum, Netezza, Exadata) or in-memory (e.g., SAP HANA) database
- Exposure to Pivotal products such as SQLFire, GemFire and Pivotal HD
- Experience with building and operating scalable applications leveraging distributed computing principles
- Exposure to programming languages supported by a big data platform, such as Java, Python, C++ or ECL
- Good exposure to web service technologies (REST, SOAP, XML, JSON, etc.)

Technical - Desired Skills
- Experience with traditional Business Intelligence and integration tools such as SAP BusinessObjects, OBIEE and Informatica
- Experience with J2EE technologies (Servlets, JMS, JAXB) and frameworks such as Spring, Struts2, Apache Axis and Hibernate
- Experience working in an Agile environment
- Familiarity with Ruby on Rails, JavaScript, Objective-C and HTML5

Soft Skills
- Strong analytical, quantitative and problem-solving skills