Our client company is looking to reinforce its Data Analytics team.
– Define and design modern data platforms in public or private clouds, covering the entire data life cycle under a metadata-driven strategy based on DataOps;
– Deploy, integrate and configure all the components of said data platforms, understanding in depth their use and application in real descriptive and predictive analytics contexts, and guaranteeing the required characteristics by applying DataOps techniques together with teams of cloud engineers;
– Define, at a methodological and governance level, how data architecture should be practiced on these platforms;
– Develop end-to-end data management solutions as part of multidisciplinary teams, collaborating in the definition of the solution and implementing all the processes involved in its construction and application;
– Design and develop logical and physical data models across different industries and technological blueprints, applying the best working methodologies for each case as well as the most appropriate data strategies, such as Kimball, Data Vault, Data Mesh or Big Tables;
– Know, manage and use the different tools and platforms available in the different clouds for the development of data management solutions: Azure Data Factory and Azure Synapse, AWS Redshift, GCP Data Fusion, GCP Dataform and Google BigQuery, as well as other cloud data ecosystem services such as Snowflake or Firebolt;
– Know, manage and use the main tools and platforms available in more legacy environments for the development of data management solutions: Talend Cloud, Oracle Exadata, Informatica Cloud, etc;
– Apply new practices and emerging technologies from traditional software development and the SaaS and Kubernetes-native worlds to the data management domain: Python, Docker, Airflow, Great Expectations, dbt Cloud, Fivetran, Argo, etc;
– Participate in the selection of new data platforms, development strategies, methodologies and partnerships together with the company's leadership and the innovation department;
– Participate, together with the business development department, the different client teams and industry experts, in developing commercial proposals from a business development role, defining the solution, the logical and physical functional architecture, and the estimates of effort and operating costs;
– Collaborate with the innovation department on refining and expanding the portfolio through the development of engineering artifacts applicable to the data and analytics ecosystem.
– A degree in Computer Engineering, Telecommunication Engineering or another academic background that has equipped you with development skills and critical thinking in this field;
– Additional training that broadens your skills, such as a Master's in Big Data and Analytics, is highly valued;
– Knowledge of architecture and the data life cycle in Data Lake, Data Warehouse or other data platform environments;
– Knowledge of different data modeling strategies: relational modeling, Kimball modeling, Data Vault modeling, wide-table modeling, hybrid data modeling, etc;
– Knowledge of the different strategies for moving data to or from a central data repository, that is, Data Collection and Data Delivery, and the techniques associated with each of them:
– Change Data Capture, file ingestion (Parquet, Avro, text files, etc.) in batch or micro-batch mode, treatment of events and their schemas, ingestion from APIs, Data Sharing, etc.