Advanced Developer, Big Data

Job Number: 12106
Location(s): Minneapolis, MN

Overview and Responsibilities

POSITION OVERVIEW

Technology at General Mills accelerates process transformation and business growth around the globe. To achieve business success, the Global Business Solutions team uses leading-edge technology, innovative thinking, and agile processes. One of General Mills' key technology priorities is driving business action through connected data.

The General Mills Enterprise Data and Development Services organization is building a big data platform based on Cloudera Hadoop as part of a multi-year strategic Data Lake program to advance the enterprise's data-driven decision-making capabilities. If you are an agile learner, have strong problem-solving skills, and are able to function as part of a highly technical, cross-functional team, we would like to hear from you.

General Mills is seeking a Big Data Developer with Hadoop/Oracle experience to act as a key technical leader in our organization.

RESPONSIBILITIES

  • Act as a key technical leader within General Mills
  • Design, create, code, and support a variety of Hadoop ETL solutions using tools including, but not limited to, Python, Scala, Kafka, and SAP Data Services (an illustrative sketch follows this list)
  • Generate and implement your own ideas for improving the operational and strategic health of the big data ecosystem
  • Participate in the evaluation, implementation, and deployment of emerging tools and processes in the big data space
  • Partner with business analysts and solutions architects to deliver business initiatives
  • Collaboratively troubleshoot technical and performance issues in the big data ecosystem
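The posting does not prescribe a specific technology for these ETL solutions; the following is a minimal, purely illustrative sketch of what a Spark-on-Hadoop ETL job might look like in Python. All table names, paths, and columns are hypothetical and are not taken from the posting.

    # Purely illustrative sketch only; paths, tables, and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Create a Spark session with Hive support so a curated table can be written back.
    spark = (
        SparkSession.builder
        .appName("illustrative-sales-etl")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Extract: read raw files from a hypothetical HDFS landing zone.
    raw = spark.read.parquet("/data/landing/sales/")

    # Transform: drop incomplete rows and aggregate amounts by day and product.
    daily = (
        raw.filter(F.col("amount").isNotNull())
           .withColumn("sale_date", F.to_date("sale_ts"))
           .groupBy("sale_date", "product_id")
           .agg(F.sum("amount").alias("total_amount"))
    )

    # Load: overwrite a hypothetical curated Hive table with the aggregate.
    daily.write.mode("overwrite").saveAsTable("curated.daily_sales")

    spark.stop()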

Qualifications

MINIMUM QUALIFICATIONS

  • Bachelor’s Degree required
  • Minimum 1 year of IT experience required
  • Strong understanding of Hadoop fundamentals
  • Database development experience using Oracle, SQL Server, SAP BW or SAP HANA
  • Job Scheduling experience
  • Process mindset with experience creating, documenting and implementing standard processes
  • Big Data Development experience using Hive and/or Spark
  • Effective verbal and written communication and influencing skills
  • Effective analytical and technical skills
  • Ability to work in a team environment
  • Ability to research, plan, organize, lead, and implement new processes or technology

PREFERRED SKILLS/EXPERIENCE

  • Bachelor’s Degree in Computer Science, MIS, or Engineering preferred
  • 2+ years of IT experience preferred
  • Python, Scala or Java development experience
  • Familiarity with Kafka 
  • Familiarity with the Linux operating system
  • Experience with agile techniques and methods

