Functie / Function:
Do you want to bring banking as we know it to the next level and contribute to the biggest digital revolution in Dutch banking history? Do you have an affinity with data and data integration, and do you agree with us that data is at the core of the value we can bring to our customers? Our client is reinventing the data landscape and working on the bank’s data marketplace. This is a seamless, single-portal experience, where the data you need can be provided securely and used quickly. Are you passionate about designing and building it? With our client, you can put your passion into action! They are looking for a Big Data engineer with proven experience in designing, building and maintaining big data platforms on Hadoop and Azure. Sounds good? Right!
To speed up and strengthen the digital & data transformation, our client is searching for the best big data engineer to help them. By helping, they mean that you:
• Understand the data technologies of today and tomorrow and know how to implement them in an effective way;
• Innovate, experiment and advise on applying new technologies;
• Are a team player, willing to be part of a team of engineers;
• Continuously improve software engineering practices;
• Work in DevOps teams and have an agile mindset.
As a Big Data Developer you will be a key contributor to their Services practice and will have the following responsibilities:
• Work with product owners to understand desired application capabilities and testing scenarios;
• Work within and across Agile teams to design, develop, test, implement, and support technical solutions across a full-stack of development tools and technologies;
• Work on a variety of development projects on a big data platform;
• Work with business stakeholders and other SMEs to understand high-level business requirements;
• Work with the Solution engineers and contribute to the development of roadmaps;
• Adhere to existing processes/standards including the development lifecycle, business technology architecture, risk and production capacity guidelines;
• Create reusable Python code;
• Use the right technologies that are fit for purpose.
Examples of the epics/episodes you could work on:
• Data streaming solutions (Kafka);
• Big data platform (Hadoop);
• Big data platform in the cloud (Microsoft Azure);
• Remaining compliant with regulations (such as GDPR);
• Security framework;
• Metadata framework;
• Enabling self-service BI tooling;
• Staying aware of the latest trends in data and bringing smart ideas to the table.
Your working environment
With a combination of goodies straight from the market leaders (such as Confluent Kafka, Databricks, Power BI, Cosmos, Cloudera/Hortonworks) and self-built tooling, CADM Data & Analytics is the data heart of our client. In this team it is acknowledged that the status quo needs to be broken and that radical modernisation of the way they treat data is necessary. Therefore, the possibilities are vast. Building software is great, but creating value out of seven petabytes of data is better! While this means they value their top-notch quality standards, you will be surprised by the informal atmosphere in which they do this. You will be part of a scrum team, and you will guide and assist the various agile scrum teams by translating enterprise architectures into solutions using the latest technologies.
Gewenst / Desirable:
As you can see, they have certain expectations. Ideally, you already meet them. If you do not, but feel you can grow into such an expert, they are also very interested in meeting you!
• You can tell your colleagues when you should go for a traditional RDBMS and when it is better to move to a public cloud solution that better fits the performance needs. In other words: you know what you're talking about;
• You not only understand the technical part, but you are also aware that to create value out of vast amounts of data, you need to implement tools to analyse it, with all the aspects that entails;
• You have proven experience in the big data world on prem and/or the cloud;
• You have an exploratory and eager mindset, always on the lookout for new developments, but you understand that for sustainability, adopting certain standards is required;
• A team player, willing to help others and share knowledge whenever you think you can;
• 5+ years of experience with Big Data tools and technologies, including working in a production environment on a Hadoop project;
• 3+ years of experience with SQL, Hive, Impala, Oozie, HDFS, Hue, Git, MapReduce and Sqoop;
• 2+ years of programming experience in Python or another object-oriented programming language;
• Analytical and problem solving skills, applied to a Big Data environment;
• Experience with large-scale distributed applications;
• Experience with Agile methodologies to iterate quickly on product changes, developing user stories and working through backlog;
• Ability to learn and apply new concepts quickly;
• Capable and eager to work under minimal direction in a fast-paced, energetic environment, managing multiple projects and priorities at once;
• Preferably an education in Computer Science, Information Science, Econometrics or Mathematics;
• Skills in data tooling and data-related programming languages are (depending on the level of the application) a big advantage for understanding and advising the development teams you support. Please find examples in the previous section.
In return, they offer:
• A good salary;
• Substantial training opportunities and a public transportation card;
• A great deal of responsibility, and the freedom to do your job;
• A well-equipped office, and all required facilities to do your job;
• Freedom to use various cloud services like Amazon Web Services or Azure;
• Opportunities to visit (tech-related) seminars;
• Required investments to boost your personal skills;
• A client base of approximately 5 million users for you to make an impact on;
• Informal atmosphere with plenty of room for fun and learning by doing.