Arch

  • Data Platform Engineer

    Job Locations: US-NC-Raleigh
    Job ID: 2018-2163
    # of Openings: 1
    Category: Information Technology
  • Overview

    The Company


    Arch Capital Group Ltd. is a Bermuda-based company that provides insurance, reinsurance and mortgage insurance on a worldwide basis. Arch Capital Services Inc. provides support and expertise to entities across ACGL to help them operate effectively and efficiently. Arch is committed to helping its associates create what’s next by providing access to a variety of programs that support their professional development and a culture that encourages innovation, collaboration and professional growth. We seek talent that thinks innovatively, values collaboration and will go the extra mile to serve our customers and develop our company.

    Responsibilities

    The Position

    The Data Platform Engineer will work closely with analytic platform architects, analysts, and development groups to create customized data pipelines, using services such as Azure Data Factory and Azure Databricks to integrate and orchestrate data transformation activities. Best practices and design patterns will be used to ensure the success of projects and the reusability of code across different Business Units. The position requires excellent communication skills, regular interaction with internal customers, and a moderate to high level of programming, scripting, and configuration of Azure cloud services. Flexibility, attention to detail, the capability to work with a geographically diverse team, and the ability to adapt and learn quickly in various technical and creative environments are critical.


    Job Responsibilities

    • Design and implement core data analytic platform components that can be shared and extended across different analytics groups within Arch
    • Review approaches and completed data pipelines against platform best practices and patterns
    • Design and maintain a common data flow pipeline in Azure Data Factory for data transformation activities such as Extract, Transform, Load (ETL)
    • Support and troubleshoot the data flow activities on cloud data warehouse environments
    • Develop data pipeline code using SSIS and/or Azure Data Factory and database stored procedures
    • Perform requirements planning, monitoring, and end-to-end requirements management throughout the data asset development life-cycle
    • Direct and help other developers and analysts to ensure data platform patterns are adhered to

    Qualifications

    Desired Skills

    • Experience with or knowledge of the following:
      • Designing, building, and documenting RESTful APIs using OpenAPI specification tools
      • Microsoft Power BI, Qlik, or Amazon QuickSight
      • Snowflake or MS SQL Data Warehouse
      • Amazon Redshift
      • Apache Spark or Hadoop
      • SparkR
      • R/RStudio
      • Linux/PowerShell scripting
      • AWS Lambda or Azure Functions

    Required Education/Experience 

    • Bachelor’s degree in Computer Science or related field
    • 1-2 years’ experience in the following:
      • Azure Data Factory and Databricks
      • C#, .NET, Python, or Java
      • Data integration services, SQL, and Extract, Transform, Load (ETL)
      • Azure or AWS for development and deployment
      • Jira or similar tool
      • Code repositories in Git/GitHub or Bitbucket
    • Experience working with a data warehouse
    • Familiarity with structured and semi-structured data formats including JSON, Avro, ORC, Parquet, or XML
    • Exposure to working in an agile work environment
