Cloud Data Consultant (Azure/AWS)


MentorMate is an industry veteran that meets complex business challenges with native, hybrid, and custom software development. We think big, design smart, and develop fast for all screens, projects, and teams. Our headquarters is in Minneapolis, Minnesota; we also have an office in Sweden, five development offices in Bulgaria, and a network of remote software development partners in more than 20 countries across the globe. With over 1,400 projects completed by our 850+ top software engineers, we innovate in sectors like healthcare, education, finance, agriculture, and beyond.

As a remote partner in our network, you will participate in the creation of enterprise-class applications on the latest technology platforms using proven design patterns. This position requires a solid hands-on developer to fully participate in the software development process, including design, development, unit testing, and technical documentation. You will use the Scrum development methodology to create 21st-century software solutions that set standards. On top of that, you can work from anywhere in the world, part- or full-time, and receive competitive pay.


Responsibilities

  • Align architecture with business requirements
  • Analyze, re-architect, and re-platform data warehouses
  • Develop, construct, test, and maintain data pipelines for multiple analytics functions
  • Identify ways to improve data reliability, efficiency, and quality
  • Troubleshoot and resolve complex issues
  • Prepare data for predictive and prescriptive modeling as well as for classic BI analytics
  • Mentor and train colleagues when necessary, by helping them learn and improve their skills, as well as innovate and iterate on best practices


Requirements

  • 3+ years of experience in Azure Big Data services (Synapse Analytics, Databricks, Azure Data Factory, Azure Machine Learning, Azure Stream Analytics, Data Lake Analytics, Azure Database, Azure Cosmos DB, HDInsight) OR AWS Big Data services (Redshift, Kinesis, Glue DataBrew, Glue, MSK, EMR, Athena, QuickSight, S3, Lake Formation, SageMaker, DMS)
  • 5+ years of experience in data engineering, software engineering, or other related roles
  • 5+ years of experience with ETL, data modeling, and data architecture
  • Strong experience in one or more of the following: SQL, Python, or Scala
  • Strong experience building data pipelines from multiple data sources, in collaboration with diverse team members
  • Experience in operating very large data warehouses or data lakes
  • Understanding of software development best practices, including query optimization, version control, code reviews, and documentation
  • Experience with building data pipelines and applications to stream and process large datasets at low latencies
  • Ability and desire to learn new languages and technologies
  • Attention to detail and strong documentation skills
  • Fluent English, both written and spoken

A significant advantage would be any experience with:

  • ETL tools such as Airflow, Talend, Matillion, Informatica, ODI, SSIS
  • Big Data technologies such as Hadoop, Databricks, Spark
  • Oracle, MS SQL Server, MySQL, PostgreSQL, Snowflake
  • Designing and implementing batch and stream data processing pipelines
  • Certification in SQL, PL/SQL, or T-SQL
  • Azure Certification

What We Offer

  • Freedom to work remotely from anywhere in the world
  • Opportunity to join a community of 850+ developers worldwide
  • Dedicated assistance from a personal account manager
  • Accounting consultations for Bulgaria-based individuals who join the network
  • Clear and fair negotiation on your payment terms
  • No third-party intermediaries; open communication with our teams
  • Inspiring opportunities to work on various enterprise projects that set standards

