Capgemini’s Cloud Infrastructure Services unit is a global team of technology experts and domain specialists that helps businesses around the world extract maximum value from their IT investments and facilitates their journeys to the cloud.
Beyond cloud journeys, we support businesses with services such as helpdesks (human and AI powered), network access and maintenance, application hosting and maintenance, IT operations, and much more. We do all this in 24 languages from four locations in Poland.
Join our global team and be part of technology transformation.
Who are we looking for?
Must have:
- Good Linux experience and understanding (preferably SUSE Linux)
- Good troubleshooting skills, especially performance analysis (see the sketch after this list)
- Ability to work in an international team
- A great sense of humor
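To make the performance analysis point concrete, here is a minimal Python sketch (our own toy example, assuming a Linux host; it is not a Capgemini tool) that reads the load averages from /proc/loadavg and compares them with the core count:

```python
import os

# Read the 1-, 5- and 15-minute load averages straight from /proc (Linux only).
with open("/proc/loadavg") as f:
    load1, load5, load15 = (float(x) for x in f.read().split()[:3])

cores = os.cpu_count() or 1

# A 1-minute load above the core count suggests CPU saturation;
# a high 15-minute load points to a sustained problem rather than a spike.
if load1 > cores:
    print(f"1-minute load {load1:.2f} exceeds {cores} cores: likely CPU saturation")
else:
    print(f"load {load1:.2f}/{load5:.2f}/{load15:.2f} on {cores} cores looks healthy")
```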
Nice to have:
- Previous experience with any Hadoop distribution (MapR or Cloudera preferred)
- Strong understanding of the networking and storage concepts used in the Hadoop ecosystem
- Knowledge of Big Data Cloud services (AWS Kinesis, AWS Elastic MapReduce, Azure Data Lake, Azure Databricks, Azure HDInsight)
- Scripting experience would be appreciated (bash, Python, PowerShell); see the sketch after this list
- Good interpersonal skills
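To illustrate the level of scripting we have in mind, a minimal Python sketch (the mount points and the 90% threshold are hypothetical) that flags file systems running low on space:

```python
import shutil

# Hypothetical mount points for a Hadoop worker node; adjust to the real layout.
MOUNTS = ["/", "/var/log", "/data"]
THRESHOLD = 0.90  # warn when a file system is more than 90% full

for mount in MOUNTS:
    try:
        usage = shutil.disk_usage(mount)
    except FileNotFoundError:
        print(f"skipping {mount}: not present on this host")
        continue
    used = usage.used / usage.total
    status = "WARNING" if used > THRESHOLD else "ok"
    print(f"{status}: {mount} is {used:.0%} full")
```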
What will you do?
We are looking for a professional experienced with the Linux ecosystem who wants to join our team and help us take our Hadoop installations to the next level of maturity. You will participate in DevOps meetings and work with other experts to improve the supported platforms. You will have the chance to work with data scientists and architects from the automotive industry and build next-generation ADAS Big Data solutions. You should be motivated, have a “can do” attitude, and be willing to keep developing your skills (no routine).
- Participate in platform meetings with the customer and Capgemini architects
- Provide guidance to the operations team on all Hadoop-related matters
- Be accountable for cluster performance analysis and problem remediation
- Take care of Hadoop cluster security and compliance
- Automate manual tasks to reduce the level of human interaction with the cluster (a sketch follows this list)
- Take part in creating the cloud migration strategy
- Cooperate with other teams, including external suppliers
- Develop and maintain existing and new documentation
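As a hedged sketch of such automation (the 85% threshold is an assumption, and the report wording can differ between Hadoop versions and distributions), a Python check that shells out to the standard `hdfs dfsadmin -report` command and warns when the cluster fills up:

```python
import re
import subprocess

THRESHOLD_PCT = 85.0  # hypothetical alerting threshold

# 'hdfs dfsadmin -report' is the standard Hadoop capacity report; the hdfs
# binary must be on PATH and the caller needs sufficient HDFS privileges.
report = subprocess.run(
    ["hdfs", "dfsadmin", "-report"],
    capture_output=True, text=True, check=True,
).stdout

# The summary contains a line such as "DFS Used%: 42.17%"; the exact
# formatting may vary slightly between distributions.
match = re.search(r"DFS Used%:\s*([\d.]+)\s*%", report)
if match is None:
    raise SystemExit("could not parse DFS usage from the report")

used_pct = float(match.group(1))
if used_pct > THRESHOLD_PCT:
    print(f"WARNING: HDFS {used_pct:.1f}% full (threshold {THRESHOLD_PCT}%)")
```

In practice a check like this would run from cron or the cluster’s monitoring stack rather than by hand.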
Who will you be working with?
Data Lake teams are focused on developing and maintaining various Hadoop platforms. We work in a dynamic environment providing delivery and project services to maintain the infrastructure, improve current standards and implement new solutions.
What do we offer?
- Working with great people in a legendary atmosphere
- No formal dress code
- Annual family picnics
- Unforgettable team-building events
- Employee volunteering opportunities and interesting CSR projects
- We value and respect diversity in terms of gender, nationality, role, age, and interests
- Internal celebration initiatives: Children’s Day, St. Nicholas Day and many more
- Supporting employees’ hobbies: Business Run, e-sport games, basketball, volleyball
- Development in expert or leader competencies
- Broad training offer with possible co-funding
- Access to Harvard Business Review knowledge base
- Introduction plan for new employees and Buddy Initiative
- A wide range of instructor-led and e-learning trainings
- Co-financing for post-graduate studies and courses
- Many companies under one roof / internal headhunters
- Internal development events: conferences, meetings, communities
- Education First platform for learning English online
- Bonuses, including referral bonuses for recommending new employees
- Additional life insurance
- Attractive package of extra benefits of your choosing (fitness, gym, cinema, etc.)
- Disability inclusion, assistive technologies, reasonable accommodations
- Private medical care for you and your family
- Bicycle parking and carpooling options
- Free coffee, water, milk and wide range of teas
- Anti-smog plants in offices
- Car leasing