SC Cleared – Senior Data Engineer
Location – Remote
Salary – £70K – £80K per annum + Bonus
Benefits – Bonus, flexible working hours, career opportunities, private medical, excellent pension, and social benefits
NOTE: This role requires existing SC Clearance.
The Client:
Curo are collaborating with a global edge-to-cloud company advancing the way people live and work. They help companies connect, protect, analyse, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world.
The Candidate:
This is a fantastic opportunity for a bright, driven, and customer-focused Senior Data Engineer. The ideal candidate will have any or all of the following: RHCE certification, Bash or Python scripting, automation using Ansible, Kubernetes administration, and basic knowledge of Spark.
The Role:
As a Senior Data Engineer you will be responsible for delivering a variety of engineering services to customers worldwide. Assignments will vary based on the successful candidate’s skills and experience; typical assignments may involve cluster installation, ETL, solution design and application development, and platform services. The growing client base is made up of Fortune 50 companies, and the assignments will be challenging and immensely rewarding. Reporting to the Global Delivery Manager, this role offers the opportunity to learn and apply big data technologies and solve related complex problems.
Duties:
- Mastering the Data Fabric and Container Platform, including MapR-FS, MapR-DB Binary and JSON Tables, MapR-Streams, Kubernetes, and ecosystem products, maintaining proficiency and currency as the technology evolves and advances.
- Achieving proficiency with cluster and framework sizing, installation, debugging, performance optimization, migration, security and automation.
- Working effectively with MapR-DB Binary and JSON Tables sizing, performance tuning, and multi-master replication.
- Handling Event Stream sizing, performance tuning, and multi-master replication.
- Ensuring Professional Services engagements are delivered to the highest standards, on time and on budget.
- Acting as a technical interface between the customer (Data Science/Data Analyst teams) and the delivery team, and as a point of escalation between the customer and Product Engineering.
- Providing best-practice guidance on exploiting the software to meet customer use cases.
- Providing technical thought leadership and advice on technologies and processes at the core of the data domain, as well as adjacent technologies.
- Engaging and collaborating with both internal and external teams as a confident participant and leader.
- Assisting with solution improvement.
- Being a technical voice to customers and the community via blogs, User Groups (UGs), and participation in leading industry conferences.
- Staying current in best practices, tools, and applications used in the delivery of professional service engagements.
Requirements:
- 5+ years of experience administering any flavour of Linux
- 3+ years of experience in architecting, administering, configuring, installing, and maintaining open-source big data applications, with focused experience on the MapR distribution
- 3+ years of hands-on experience supporting Hadoop ecosystem technologies and open-source big data applications, with focused experience on the MapR or CDP distributions.
- Expertise in administration of MapR-DB / Hive / HBase / Spark / Oozie / Kafka.
- Strong scripting skills (Bash or Python preferred).
- Familiarity with commercial IT infrastructures including storage, networking, security, virtualization, and systems management.
- Good understanding of High Availability.
- Able to implement Hadoop Data Security using Groups/Roles.
- Ability to implement and manage Cluster security.
- Ability to troubleshoot problems and quickly resolve issues.
- Cluster maintenance as well as creation and removal of nodes.
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
- Screening cluster job performance and capacity planning.
- Monitoring cluster connectivity and security.
- Collaborating with application teams to install operating system and MapR updates, patches, and version upgrades when required.
- Integration with other Hadoop platforms.
- Familiarity with Ansible, Puppet, or Chef.
- Familiarity with Kubernetes.
- Proficiency in basic Java or Scala programming (preferred but not required).
- Bachelor’s degree in CS or equivalent experience.
- Strong verbal and written communication skills are required.
To apply for this SC Cleared – Senior Data Engineer job, please click the button below and submit your latest CV.
Curo Services endeavour to respond to all applications. However, this may not always be possible during periods of high volume. Thank you for your patience.
Curo Resourcing Ltd acts as an Employment Business for contract and temporary recruitment as well as an Employment Agency in relation to permanent vacancies.
Job Reference: RL6770