The International Committee of the Red Cross (ICRC) is a neutral, impartial and independent humanitarian organization that works to protect and assist people affected by armed conflict and other situations of violence. To find out more about the ICRC's mission and activities, visit www.icrc.org. The ICRC ICT division is responsible for designing, implementing and supporting ICT solutions for more than 10,000 employees worldwide.
As part of the ICRC's digital transformation into a data-driven institution, the ICRC is looking for an SQL Data Engineer within its ICT Data & CRM sector to support a growing data platform used to store semi-structured and unstructured data (HDFS, MongoDB, Elasticsearch) as well as structured data sets (ODS/Data Warehouse). The SQL Data Engineer will play an integral part within the BI/Data Engineering team in the design, data modelling and development of ODS/Data Warehouse ETL processes, leveraging both existing and new technologies. He/she will work in close collaboration with Business Analysts and Tableau Visualization Experts to gather data requirements, as well as with the Artificial Intelligence team to provision raw and curated data sets into SQL, Elasticsearch or MongoDB.
The ICRC's “Cloud first” strategy will allow the data repository to scale by leveraging Azure capabilities. The successful candidate will have the opportunity to join this transformation at an early stage.
The BSSC is looking for a suitable candidate to fill the following position:
SQL Data Engineer
- Be responsible for the architecture and design of enterprise Data Warehouse processes following the development standards
- Design and develop Extraction, Transformation and Loading (ETL) processes to acquire and load data from internal and external sources using Microsoft SSIS and SQL Procedures.
- Be accountable for all aspects of the data warehouse processes including data modelling, ETL, data validation and implementation.
- Document and maintain business definitions, data dictionary, and data mapping.
- Implement data source integrations using web service protocols such as REST and SOAP.
- Develop and implement programs in languages and technologies such as SQL stored procedures, C# and Python.
- Evaluate data process optimization opportunities and develop recommendations and improvements.
- Bachelor’s Degree in Computer Science, Information Systems or a related field. An equivalent combination of education and experience may be considered.
- 5-8 years of hands-on experience developing and testing Data Warehouse and Business Intelligence solutions
- Significant data modelling experience (conceptual, logical and physical model design; Operational Data Stores, Enterprise Data Warehouses and Data Marts).
- Knowledge of SQL and query optimization for data platforms such as SQL Server, Azure SQL DW.
- 5+ years of experience with the BI/DW lifecycle components including Source Data Analysis, ETL, ODS, Data Marts.
- 2+ years of experience with programming and scripting languages (Python, C#, PowerShell)
- Knowledge of Azure Data factory, Azure Data Lake and Azure SQL Data Warehouse is a plus
- Advanced use of Microsoft TFS 2017, particularly the management of Product Backlog Items (creation, follow-up), and Visual Studio (source control)
- Experience with Big Data technologies, AWS or Azure cloud environments, and real-time integration methodologies will be a plus
- Experience working with a variety of data sources such as MySQL, SQL Server, PostgreSQL, HDFS, and MongoDB will be a plus
WHAT WE OFFER
- Work and progressive professional development in an exciting ICT environment using the latest technologies
- An inspiring opportunity to practice your profession in a humanitarian and multicultural organization
- A stimulating benefits package
If you are interested in this position, please send us your CV and a motivation letter in English.
Only short-listed candidates will be contacted.
Deadline for applications: 25.12.2019.