Specialist Technology - Big Data, Hadoop, Scala, Kafka
Job Summary:
The Big Data Architect will work within the Capital Markets / Commodities / Financial space to architect, develop, and implement high-end software products and services involving large-scale data processing: data ingestion, in-stream analytics, and batch analytics. The opportunity to work with big data platforms such as Spark, Hadoop, and HBase is a salient feature of this role.
Primary Responsibilities:
- Develops reusable frameworks/components and POCs to accelerate development on projects
- Owns consistency and high quality in solution delivery
- Sets up the development and production environments and troubleshoots performance issues
- Participates in architecture and design reviews for projects that require complex technical solutions
- For package implementations, performs the gap analysis between business requirements and the package features, and designs the configuration, customizations, extensions, and interfaces required to meet the requirements
- Represents the organization in customer-facing communication pertinent to Sapient's technical expertise on the specific platform
- Actively identifies areas of focus and gains expertise through activities such as POCs and research
- Develops and promotes architectural best practices and standards
Experience Guidelines:
- 8 to 12 years of experience with strong hands-on knowledge in Java and Scala development
- More than 2 years of experience in Big Data application development covering various data processing techniques: data ingestion, in-stream data processing, and batch analytics
- Excellent knowledge of and experience with the Hadoop stack (Hadoop, Spark, Spark Streaming, HBase, Sqoop, Flume, Shark, Oozie, etc.)
- Solid exposure to Core Java and distributed computing
- Good understanding of NoSQL platforms such as MongoDB and Cassandra
- Expertise in business requirement gathering and analysis, and in conceptualizing high-level architectural frameworks and designs
- Must have experience in designing and architecting large-scale distributed applications
- Thorough understanding of solution architecting and estimation processes
- Excellent consulting skills, oral and written communication, presentation, and analytical skills
- Excellent knowledge of relational database technologies with experience on databases such as Oracle or SQL Server.
- Active involvement in thought leadership: best-practices development, white papers, and/or POVs on Big Data technology
- Self-starter, with a keen interest in growing in the Analytics space
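The batch vs. in-stream processing distinction called out above can be sketched as a minimal word-count example in plain Java, using only the standard library (no Spark or Kafka dependency; the class and method names are illustrative, not part of any framework API):

```java
import java.util.*;
import java.util.stream.*;

public class ProcessingModes {
    // Batch analytics: the whole dataset is available up front,
    // so it can be aggregated in a single pass over all records.
    static Map<String, Long> batchWordCount(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.trim().split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    // In-stream processing: records arrive one at a time, so running
    // state is kept and updated per record, analogous to the state a
    // streaming job maintains between incoming messages.
    static Map<String, Long> streamWordCount(Iterator<String> stream) {
        Map<String, Long> counts = new HashMap<>();
        while (stream.hasNext()) {
            for (String w : stream.next().trim().split("\\s+")) {
                if (!w.isEmpty()) counts.merge(w, 1L, Long::sum);
            }
        }
        return counts;
    }
}
```

Both produce the same counts; the difference is whether the aggregation sees the data as a complete collection or as an incremental feed.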
Additional/Optional Skills:
- Knowledge of traditional Data Warehousing architecture, design principles
- Knowledge of any one BI tool, such as Cognos, BO, Tableau, or QlikView
- Knowledge of any one ETL tool, such as Informatica, DataStage, or Talend
- Knowledge of data mining and machine learning algorithms used in analytics, and of tools such as R, Python, or Weka
Education:
Full Time Bachelor's / Master's degree (Science or Engineering preferred)
About Global Markets
Sapient Global Markets, a part of Publicis.Sapient, is a leading provider of services to today's evolving financial and commodity markets. We provide a full range of capabilities to help our clients grow and enhance their businesses, create robust and transparent infrastructure, manage operating costs, and foster innovation throughout their organizations. We offer services across Advisory, Analytics, Technology, and Process, as well as unique methodologies in program management, technology development, and process outsourcing. Sapient Global Markets operates in key financial and commodity centers worldwide, including Boston, Calgary, Chicago, Dusseldorf, Frankfurt, Houston, London, Los Angeles, Milan, New York, Singapore, Washington D.C. and Zurich, as well as in large technology development and operations outsourcing centers in Bangalore, Delhi, and Noida, India. For more information, visit www.sapientglobalmarkets.com.