Sr. Engineer, Edge Analytics – Meters & Gateway Software

Job Description


Description:


If you desire to be part of something special, to be part of a winning team, to be part of a fun team – winning is fun.  We are looking for a Senior Engineer based in Pune, India. At Eaton, making our work exciting, engaging, meaningful; ensuring safety, health, wellness; and being a model of inclusion & diversity are already embedded in who we are - it’s in our values, part of our vision, and our clearly defined aspirational goals.  This exciting role offers the opportunity to:


  • Develop advanced machine learning, data analytics, AI and optimization algorithms and apply them to create new intelligent devices, solutions and services for Eaton’s MRIoT business.


  • Derive, interpret and effectively deliver the results of data analysis using visualization techniques, tools, custom applications, or by narrating stories about the solutions to business problems.


  • Understand traditional and new data analysis methods that can build statistical models and discover patterns in data.


  • Utilize a broad range of techniques in machine learning, data mining, statistics, and big data interrogation methods.


  • Explore data, ask the right questions, and provide compelling findings.


  • Provide work estimates and participate in design and implementation reviews.


  • Develop and execute agile work plans for iterative and incremental project delivery.


 


Qualifications

Requirement:

  • Bachelor’s degree in Computer Science, Electronics, E&TC, or Instrumentation Engineering

  • 5+ years of experience in more than one programming language (R, Python, Java, C++, C#, etc.).


  • Deploy modern data management tools to curate our most important data sets, models and processes, while identifying areas for process automation and further efficiencies.


  • Evaluate, select and acquire new internal & external data sets that contribute to business decision making.


  • Engineer streaming data processing pipelines.


  • Drive adoption of cloud technology for data processing and warehousing.


  • Experience with cloud/edge providers (Azure, AWS, Google Cloud, etc.).


  • Ability to specify and write code that is accessible, secure, and performs in an optimized manner, with an ability to output to different types of consumers and systems.


  • Strong knowledge of big data query tools to perform ad hoc queries of large datasets.


  • Solid understanding of relational and non-relational (NoSQL, time-series) database systems.


  • Experience with in-memory, file-based and other data stores.


  • Solid understanding of Java and/or Python and associated IDEs (Eclipse, IntelliJ, etc.).


  • Extensive experience with Agile development methodologies and concepts.


  • Strong problem solving and software debugging skills.


  • Experience building APIs to support the data consumption needs of other roles.


  • Deep understanding of the multi-dimensionality of data, data curation and data quality, such as traceability, security, performance latency and correctness across supply and demand processes.


  • In-depth knowledge of relational and columnar SQL databases, including database design.


  • Collaborate broadly across multiple functions (data science, engineering, product management, IT, etc.) to make key data readily available and easily consumable.


  • Programming skills in C++ and programming of fieldbus systems (Profibus, Modbus, CODESYS).


  • Knowledge of IoT technologies, including cloud processing (e.g., Azure IoT Hub).


  • Knowledge of data analysis tools, such as Apache Presto, Hive, Azure Data Lake Analytics, AWS Athena, and Zeppelin.


  • Knowledge of Hadoop and MapReduce/Spark or related frameworks.


  • Knowledge of MongoDB, DocumentDB, Cosmos DB.


  • Knowledgeable in leveraging multiple data transit protocols and technologies (MQTT, REST APIs, JDBC, etc.).


  • Good interpersonal and communication skills.


  • Good communication skills in English (verbal and written) to convey information effectively to customers and technical staff.


 

If you are the one we are looking for, we hope to hear from you soon!


#LI-MY1

We make what matters work. Everywhere you look—from the technology and machinery that surrounds us, to the critical services and infrastructure that we depend on every day—you’ll find one thing in common. It all relies on power. That’s why Eaton is dedicated to improving people’s lives and the environment with power management technologies that are more reliable, efficient, safe and sustainable. Because this is what matters. We are confident we can deliver on this promise because of the attributes that our employees embody. We’re ethical, passionate, accountable, efficient, transparent and we’re committed to learning. These values enable us to tackle some of the toughest challenges on the planet, never losing sight of what matters.

Job: Engineering

Region: Asia Pacific
Organization: INNOV Innovation Center

Job Level: Individual Contributor
Schedule: Full-time
Is remote work (i.e. working from home or another Eaton facility) allowed for this position?: No
Does this position offer relocation?: Relocation from within hiring country only
Travel: No