Full-time · Remote allowed

Edvantis is looking for a Data Engineer (ETL). Our client is an award-winning Applied AI and Big Data software and services company, driven by a deep desire to solve transformational problems at the heart of businesses. Their signature approach combines groundbreaking machine-learning research with disciplined cloud and data-engineering practices to create breakthrough impact at unprecedented speed.

Responsibilities:

  • Function as a lead developer within the Data Management Team, creating scalable ETL and reporting solutions that meet the organization's business needs
  • Partner with Business Analysts, Subject Matter Experts from Business Units, and counterparts in IT to complete system requirements, technical specifications, and process documentation for assigned projects
  • Review, analyze, and evaluate the scope of the business problems presented and help identify viable technical solutions
  • Develop ETL and data warehousing solutions using the Infoworks product and services
  • Drive data warehouse architectural decisions and development standards
  • Create detailed technical specifications and release documentation for assigned ETL projects
  • Ensure data integrity, performance quality, and resolution of data load failures
  • Multi-task across several ongoing projects and daily duties of varying priorities as required
  • Provide input into each phase of the system development lifecycle as required
  • Ensure adherence to published development standards and Hadoop best practices, resulting in consistent and efficient implementation of project components
  • Exercise sound judgment, make decisions, negotiate effectively, and solve problems

Requirements:

  • 8+ years of experience in IT
  • 3 years of experience with DBMS concepts, including strong SQL skills and programmatic work with RDBMS and NoSQL databases such as HBase (see the sketch after this list)
  • Design and development experience with Java/Core Java and related technologies
  • Experience with Hadoop and NoSQL solutions
  • Experience with ETL and ELT tools
  • Experience working in Azure, AWS, or GCP cloud environments
  • English – Upper-Intermediate or higher
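
For illustration only, here is a minimal sketch of the kind of programmatic HBase interaction mentioned above, using the standard HBase Java client API (org.apache.hadoop.hbase.client). The table name customer_events, the column family d, and the row key are hypothetical examples chosen for this sketch, not specifics of the role:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 // "customer_events" is a hypothetical table name
                 Table table = connection.getTable(TableName.valueOf("customer_events"))) {
                // Write one cell: row key, column family "d", qualifier "status"
                Put put = new Put(Bytes.toBytes("cust-0001"));
                put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("status"), Bytes.toBytes("loaded"));
                table.put(put);

                // Read the same cell back and print it
                Result result = table.get(new Get(Bytes.toBytes("cust-0001")));
                byte[] value = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("status"));
                System.out.println("status = " + Bytes.toString(value));
            }
        }
    }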

Will be a plus:

  • Background in all aspects of software engineering, with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture
  • Hands-on design and development experience with Big Data technologies: Hadoop, Pig, Hive, MapReduce, and web services
  • Hands-on experience working with Teradata, Netezza, Oracle, and DB2 databases
  • Strong understanding of file formats: ORC, Parquet, and Avro (see the Avro sketch after this list)
  • Strong understanding of and hands-on programming/scripting experience with UNIX shell, Python, Perl, and JavaScript
  • Experience working with large data sets, including performance tuning and troubleshooting
  • Bachelor’s degree or foreign equivalent from an accredited institution; years of progressive experience in the specialty will also be considered in lieu of a degree
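
As a hedged illustration of the file-format item above, the sketch below writes one record to an Avro container file using the standard Apache Avro Java API. The Event schema, its fields, and the output file name are hypothetical examples chosen for this sketch:

    import java.io.File;
    import org.apache.avro.Schema;
    import org.apache.avro.SchemaBuilder;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;

    public class AvroSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical schema: a record with a string id and a long timestamp
            Schema schema = SchemaBuilder.record("Event").fields()
                .requiredString("id")
                .requiredLong("ts")
                .endRecord();

            try (DataFileWriter<GenericRecord> writer =
                     new DataFileWriter<>(new GenericDatumWriter<>(schema))) {
                writer.create(schema, new File("events.avro"));
                GenericRecord rec = new GenericData.Record(schema);
                rec.put("id", "evt-0001");
                rec.put("ts", 1700000000L);
                writer.append(rec); // the schema travels with the file, unlike CSV
            }
        }
    }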

We offer:

  • Career and professional growth
  • Competitive salary
  • Friendly and highly professional teams
  • Large, comfortable office with its own parking area and restaurants nearby
  • Medical insurance coverage for employees (COVID-19 included), plus an option for family insurance coverage at a corporate rate
  • 12 paid sick days and all public holidays
  • 18 paid working days of vacation
  • English/German language courses
  • Comfortable office facilities (kitchens, showers, coffee/tea points, sports activities such as athletics, food delivery services, etc.)

To apply for this job, fill in the form and upload your resume.
