Eaton Careers

Senior Software Engineer-Big Data

Dublin, Ireland
Engineering


Job Description

Eaton is proud to announce the launch of our new cutting-edge Research and Development Center, based at our global headquarters in the heart of Dublin. In the age of Big Data and the Internet of Things, our Center for Intelligent Power focuses on the technical capabilities needed to develop next-generation power management products and services.



We're now looking for a Senior Software Engineer-Big Data who is passionate about their craft. In this role, you will be involved in the architecture, design, and management of edge and on-premises data and control processing systems for streaming time-series and batched data. In addition to building and maintaining these systems, you will work to ensure that data scientists and business users across the enterprise can readily access data when and where it is needed.

Your key deliverables:
  • Working with your team and others, contributing to the architecture, design, and management of secure, high-performance edge and on-premises appliances for time-series data processing and time-bounded control loop operation
  • Recommending, setting up, integrating and utilizing streaming data processing pipeline tools and frameworks as required for business and data science needs
  • Recommending and setting up appropriate performance-monitoring solutions, then authoring and implementing improvements based on the results
  • Demonstrating and documenting solutions using flowcharts, diagrams, code comments, code snippets, and performance instrumentation
  • Evaluating business requirements to determine potential solutions
  • Authoring high-quality, high-performance, unit-tested code to extract and transform data based on business and data science needs
  • Working directly with stakeholders, engineering, and test to create high-quality solutions that solve end-user problems
  • Mentoring others in the use of your tools and techniques
  • Providing work estimates and participating in design, implementation, and code reviews
  • Developing and executing agile work plans for iterative and incremental project delivery
  • Exploring and recommending new tools and processes that can be leveraged across the data preparation pipeline to add capabilities and improve efficiency
  • Integrating multiple sources of data through efficient extract, transform, and load (ETL) and other workflows
  • Collaborating broadly across multiple functions (data science, engineering, product management, IT, etc.) to make key data readily available and easily consumable

Qualifications


Do you have?

 

Essential:

  • Bachelor’s degree or higher in computer science or software engineering
  • Previous experience in developing and designing technology solutions
  • 5+ years’ experience in the software industry as a developer, with a proven track record of shipping high-quality products
  • 2+ years of experience preparing big data infrastructures, integrating data from disparate systems, containerisation (e.g. using Docker), and managing these systems
  • Ability to write complex queries that are accessible, secure, and optimised for performance, with the ability to output to different types of consumers and systems
  • Strong knowledge of Scala to perform in-place processing of streaming data
  • Solid understanding of relational and non-relational database systems
  • Experience with in-memory, file-based and other data stores
  • Solid understanding of Java and/or Python and associated IDEs (Eclipse, IntelliJ, etc.)
  • Demonstrable experience with Agile development methodologies and concepts
  • Strong problem solving and software debugging skills
  • Experience building APIs to support data consumption needs of other roles
  • Excellent verbal and written communication skills including the ability to effectively explain technical concepts
  • Up to date with emerging software development and engineering tools, trends, and methodologies
  • Good judgment, time management, and decision-making skills

Desirable:


  • Knowledgeable in leveraging multiple messaging & service protocols and related technologies (ZeroMQ, AMQP, MQTT, OPC UA, RESTful queries)
  • Knowledge of the Apache ecosystem, particularly Cassandra and MapReduce/Spark, and of related frameworks (e.g. the Play Framework)
  • Knowledge of cloud development platforms such as Azure or AWS and their associated data storage options
  • Knowledge of MongoDB, DocumentDB, or Cosmos DB

Then we want to hear from you!

 

What Eaton offers:


  • A competitive compensation and benefits package
  • A contract with a fast-growing global company
  • Challenging projects in a dynamic, collaborative team
  • Great promotional opportunities – we encourage internal promotion whenever possible

 

Candidates applying for this vacancy will be subject to background screening.


#LI-MM2 


We make what matters work. Everywhere you look—from the technology and machinery that surrounds us, to the critical services and infrastructure that we depend on every day—you’ll find one thing in common. It all relies on power. That’s why Eaton is dedicated to improving people’s lives and the environment with power management technologies that are more reliable, efficient, safe and sustainable. Because this is what matters. We are confident we can deliver on this promise because of the attributes that our employees embody. We’re ethical, passionate, accountable, efficient, transparent and we’re committed to learning. These values enable us to tackle some of the toughest challenges on the planet, never losing sight of what matters.

Job: Engineering

Region: Europe, Middle East, Africa
Organization: CTO Corporate Technology Office

Job Level: Individual Contributor
Schedule: Full-time
Is remote work (i.e. working from home or another Eaton facility) allowed for this position?: No
Does this position offer relocation?: Relocation from within hiring region only
Travel: Yes, 10% of the time

Requisition ID: 061578