Please mention JobSuchmaschine in your application.
For one of our great clients, we are looking for a Senior Big Data Architect in Zürich.
This is a fantastic opportunity. I would love to discuss it further with you, including what's in it for you, and I will be happy to get in touch and introduce you. Please apply to Michaela: firstname.lastname@example.org
What we need in a candidate
• Extremely well-versed in Big Data architecture and technologies
• Experience with Cloudera or a similar Hadoop platform
• Should have worked in the capacity of a Big Data architect in prior roles and programs
• Should have led Big Data implementations from inception to eventual delivery
• Thorough understanding of the value proposition of Big Data in a highly complex data environment, with the ability to comprehend and capture the other critical components of an end-to-end solution
The role requires working closely with client stakeholders (IT, Architecture, and Business) to formulate the architecture, strategy, and roadmap for leveraging and extending the Big Data platform as a backbone for end-client-facing solutions, while also serving as a data store and repository for regulatory and risk reporting, analytics, and process automation.
Architect and roadmap an end-to-end solution for a business transaction store, extending the Cloudera Big Data stack while considering and utilizing other components and platforms available within the client landscape.
Design and architect critical data and architecture components
Create points of view and drive pilots for the overall architectural solution options under consideration
Estimate and size the Big Data solution to meet performance, security, real-time, and other business needs
Ensure seamless integration of the Cloudera Hadoop Platform into the existing landscape, and architect and design the corresponding integration technologies.
Support cross-functional teams in driving Hadoop implementation based on the customer's business requirements.
Design data transformation and aggregation solutions utilizing MapReduce, Pig, and Hive, including loading and transforming large sets of structured and unstructured data.
Assist in developing, designing, and implementing solutions using Big Data technologies and agile implementation methodologies (Scrum)
Excellent knowledge of Cloudera Hadoop Platform
Experience architecting solutions on Hadoop, utilizing and extending components such as Kafka, Spark, HBase, HDFS, Flume, Storm, MapReduce, Pig, Sqoop, MongoDB, Scala, Platfora, and MDM methodology, as driven by needs
Big Data platform estimation and sizing experience in large-scale implementations
Understanding of and experience with one or more NoSQL databases such as MarkLogic, MongoDB, Cassandra, etc.
Nice-to-have skills
German language (desirable)
English at an advanced business and technical level
Know-how of the financial industry and IT business models
Certifications and accreditations in Big Data and associated technologies (desirable)