Trane Technologies Data DevOps Administrator in Minneapolis, Minnesota

Data DevOps Administrator

314 W 90th St, Minneapolis, Minnesota, United States

Engineering

Requisition # 2201824

At Trane Technologies™ and through our businesses, including Trane® and Thermo King®, we create innovative climate solutions for buildings, homes, and transportation that challenge what’s possible for a sustainable world. We’re a team that dares to look at the world’s challenges and see impactful possibilities. We believe in a better future when we uplift others and enable our people to thrive at work and at home. We boldly go.

Thermo King is seeking a Data DevOps Administrator to join an entrepreneurial technical team that is applying innovative analytic approaches to extremely complex problems in the Industrial Internet of Things (IIoT) space. The team is responsible for managing and maintaining the machine data pipeline for the business and enabling advanced analytics capabilities. This position will be responsible for solving complex data processing problems to ensure quality data is available for applications and analysis.

The Data Engineer will work with the Cloud Architect to ensure that our data infrastructure is able to meet the demands of our internal users and external-facing applications. They will be responsible for managing our data transformation and BI platforms and ensuring that the underlying data architecture is optimized to meet end user needs. In addition, the Data Engineer will investigate data errors across platforms and develop the data quality management processes necessary to ensure high quality data availability. This is an opportunity to be on the ground floor of DataOps innovation for a business that is growing its data assets at a rapid pace.

Key Responsibilities:

  • Create and maintain optimal data pipeline architecture.

  • Assemble large, complex data sets that meet functional / non-functional business requirements.

  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL along with AWS and Azure big data technologies.

  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

  • Resolve issues assigned to the Data Ops and Analytics teams across multiple technologies such as Alteryx, Tableau, application development, and scripting languages.

  • Manage Alteryx and Tableau environments and track performance metrics; assess areas for performance improvement and execute on those plans.

  • Create and maintain specifications and process documentation to produce the required data deliverables (data profiling, source to target maps, ETL flows).

  • Work with internal teams to design the data architecture necessary to ingest new data sources and deliver them to non-technical users quickly and with high quality.

Key Technical Skills:

  • Bachelor’s or master’s degree in Computer Science, Information Systems, Engineering, or equivalent.

  • 3+ years working with multiple big data file formats and databases.

  • Experience with object-oriented or functional scripting languages: Python, Java, C++, Scala, etc.

  • Experience with stream-processing systems: Storm, Spark Streaming, Kinesis, Kafka, etc.

  • Ability to write complex SQL queries and optimize them to pull the required data efficiently.

  • 5+ years of relevant experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.

  • 3+ years of development experience on AWS cloud services.

  • Working experience with AWS Redshift, MS SQL Server, Oracle, or equivalent RDBMS and NoSQL databases.

  • Strong development experience with shell scripting and Windows scripting.

  • 2+ years of working experience with DevOps tools and hands-on experience with CI/CD tools.

  • Ability to build project-specific data pipelines (ETL processes) and validation tools using Python, SQL, and AWS cloud technologies.

  • Experience creating and running SQL queries and stored procedures, and performing basic DB administrative tasks.

  • Experience with relational databases, relational data models, SQL, and concepts of implementing data-level security in an RDBMS.

  • Experience with the following AWS services and related technologies: AWS Glue ETL jobs, AWS Lambda, AWS Lake Formation, Redshift, AWS RDS, MySQL, S3, Snowflake, PySpark, and EMR.

  • 3+ years of experience with any of the distributed big data ecosystems (Hadoop, Spark, Hive, Delta Lake) or Databricks.

  • Experience with monitoring tools such as Splunk, Grafana, and CloudWatch is a plus.

  • Working experience with container management frameworks such as Docker, Kubernetes, and ECR.

Base annual salary is $78,000 - $151,000.

We offer competitive compensation and comprehensive benefits and programs that help our employees thrive in both their professional and personal lives. We are proud of our winning culture which is inclusive and respectful at its core. We share passion for serving customers, caring for others, and boldly challenging what’s possible for a sustainable world.

We are committed to achieving workforce diversity reflective of our communities. We are an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, pregnancy, age, marital status, disability, status as a protected veteran, or any legally protected status.
