
100% REMOTE OPPORTUNITIES FROM BRAZIL

1. Data Lake Architect


What will you be doing?
· Design, implement and optimize data lake architectures.
· Build robust, scalable data systems and pipelines.
· Acquire and ingest large volumes of data in near real-time fashion.
· Combine raw information from external & internal sources (APIs, Databases).
· Monitor and ensure the data quality of the data lake components and introduce new techniques for
enhancing overall data lake health.
· Evaluate business needs and objectives in association with product owners and drive the requirement
analysis from a technical perspective.
· Lead the technical analysis and solution design of new features.
· Participate actively in code reviews and ensure our craft's best practices are followed.
· Identify location/time patterns in geospatial data and present your findings to stakeholders.
· Mentor other/new team members on technical areas and use your technical expertise to level up the
team.
· Be a member of a scrum team working with Agile methodologies.
Who are we looking for?
· BSc, MSc in Computer Science, Electrical/Computer Engineering or any related technical discipline.
· Minimum 5 years of production-level experience in big data manipulation, using a high-level
programming language, e.g. Python/Java/Scala, solving complex problems and delivering quality outcomes
(we use Python).
· Working experience in building robust data pipelines using open source distributed computing
frameworks (Apache Spark, Apache Flink, Dask).
· Working experience in designing, constructing, cataloging and optimizing data lake infrastructures
(e.g. MinIO / Amazon S3, Hive Metastore / Glue Data Catalog).
· Experience with Cloud Technologies and Serverless Computing (we use AWS).
· Familiarity with using Docker for local development and with tuning applications deployed on a
Kubernetes cluster.
· Familiarity with performing SQL analytic workloads against cloud data warehouses (e.g. Amazon
Redshift) or data lake query engines (e.g. Presto, Amazon Athena).
· Excellent understanding of software testing, agile development methodology and version control.
· Excellent understanding of Big Data File Formats (Apache Parquet/Avro/ORC) and how to leverage
the power of their metadata information.
· We are a multinational company, fluency in English is a must.
· We thrive through team collaboration, we are on the lookout for team players.
· We encourage everyone to think outside the box; curiosity and willingness to learn new technologies
and evolve as an individual and as a team member are highly appreciated.

What it would be great to have (a strong plus)


· Working experience in building scalable data streaming applications (e.g. Spark Streaming, Apache
Flink, Amazon Kinesis Data Streams).
· Working experience with a workflow orchestration tool (e.g. Airflow, Luigi).
· Professional exposure to SQS/SNS, Apache Kafka or other brokers.
· Knowledge of NoSQL databases, mainly key-value data stores (Redis) and document-oriented
databases (MongoDB).
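As a small illustration of the data-quality monitoring responsibility described in this role, here is a minimal Python sketch (Python being the team's stated language). Everything in it — the `check_batch` helper, the `QualityReport` fields, and the health policy — is a hypothetical example, not part of the actual role or codebase:

```python
# Hypothetical sketch of a data-quality gate for a batch of ingested records.
from dataclasses import dataclass
from typing import Iterable

@dataclass
class QualityReport:
    total: int
    nulls: int
    duplicates: int

    @property
    def healthy(self) -> bool:
        # Example policy only: reject batches with any nulls or >1% duplicate ids.
        return self.nulls == 0 and self.duplicates <= self.total * 0.01

def check_batch(rows: Iterable[dict]) -> QualityReport:
    """Count rows with null values and rows whose 'id' was already seen."""
    seen: set = set()
    total = nulls = dupes = 0
    for row in rows:
        total += 1
        if any(v is None for v in row.values()):
            nulls += 1
        key = row.get("id")
        if key in seen:
            dupes += 1
        seen.add(key)
    return QualityReport(total, nulls, dupes)
```

In practice a check like this would run per ingestion batch and feed the lake-health metrics mentioned above; the exact checks and thresholds would of course be defined by the team.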
2. Cloud Infrastructure

What will you be doing?


· Be responsible for provisioning and maintaining the cloud infrastructure, and ensure its optimal
operation
· Participate in system architecture discussions and be responsible for proposing and implementing
optimal solutions
· Create and utilize tools to monitor our applications and services in the cloud including system health
indicators, trend identification, and anomaly detection
· Plan, build, & maintain CI pipelines
· Collaborate with development teams to help engineer scalable, reliable, and resilient software running
in the cloud
· Be part of a scrum team and be an important beacon of knowledge and mentorship for less senior
members of your craft
· Automate as much as possible
· Document design decisions up front and ensure adherence to them during implementation

Who are we looking for?


· Minimum 5 years of working experience with AWS services such as Amazon VPC, Amazon ELB, API
Gateway, CloudWatch, CloudTrail, CloudFront, Glue, Lambda functions, Athena or other relevant services
· Experience with Infrastructure Code tools of your choice (Terraform, CloudFormation)
· Experience with CI/CD services (Jenkins, Gitlab CI, Bamboo, BitBucket pipelines)
· Fluent use of scripting languages (Bash, Python, Ruby)
· Strong understanding of modern web app stacks and how they operate
· Strong knowledge of networking, Firewalling, Routing, Load Balancing
· Experience in cloud infrastructure related operations
· Experience with container orchestration platforms (Docker, AWS EKS, AWS ECS, etc)
· A strong sense of ownership over the infrastructure that makes the platform operational
· Be proactive in communicating ideas and challenges. We expect everyone to think out of the box and
be creative beyond the confines of role definitions
· Fluency in English
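The monitoring duties above mention trend identification and anomaly detection. As a minimal, hypothetical Python sketch of that idea — the function name, window size, and threshold are all illustrative assumptions, not the company's tooling — a rolling z-score check over metric samples might look like:

```python
# Hypothetical sketch: flag metric samples that deviate strongly from the
# mean of the preceding window, a minimal form of anomaly detection.
from statistics import mean, stdev

def anomalies(samples, window=5, threshold=3.0):
    """Return indices of samples more than `threshold` standard deviations
    away from the mean of the preceding `window` samples."""
    flagged = []
    for i in range(window, len(samples)):
        ref = samples[i - window:i]
        mu, sigma = mean(ref), stdev(ref)
        if sigma and abs(samples[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged
```

A real setup would instead rely on CloudWatch alarms or similar managed tooling; the sketch only shows the underlying idea.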
3. Software Engineering and Architecture Design

What will you be doing?


· Lead system architecture discussions and be responsible for proposing optimal solutions.
· Collaborate with multiple teams to determine functional and non-functional requirements
· Identify and help define common technology enablers across various departments of large
organizations.
· Work with technical teams and other related internal divisions to ensure that the client’s requirements
are captured, understood and considered.
· Support the development of high-level technical service designs as part of the business ideation phase.
· Advise on how to create system and data correlation activities to enable integrated cross-area
services.
· Propose recommendations / improvements on Digital Strategy & Technology.
· Participate in and support company business growth activities

Who are we looking for?


· Minimum 5 years of experience in engineering and software architecture design.
· Background in any of the following: Infrastructure Architecture, Data Engineering or DevOps
Engineering.
· Experience in the design of solutions for Cloud Platforms & Data Platforms.
· Experience with at least one of the major cloud service providers (AWS, Azure, GCP)
· In-depth understanding of the business drivers for digital transformation.
· Being up to date with emerging computing trends and assessing their impact on partner practice
building opportunities.
· Experience working with agile methodologies / in agile teams is highly appreciated.
· We are a multinational company, fluency in English is a must.
· We thrive through team collaboration, we are on the lookout for team players.
· We encourage everyone to think outside the box; curiosity and willingness to learn new technologies
and evolve as an individual and as a team member are highly appreciated.
100% REMOTE OPPORTUNITIES FROM PORTUGAL

1. BI Developer Next Gen


Notes: We are looking for candidates with 5 or more years of relevant experience in BI. The
technologies highlighted below are critical for the Next Gen BI Developer / Data Engineer role.

Role Description Skills:


BI Soft skills:
English level C1
Excellent communication and interpersonal skills
Confident, driven and strong determination to succeed
Extremely well organised, detail oriented when dealing with complex business change
High levels of personal energy
A professional and flexible attitude while multi-tasking at an extremely fast pace
Acts swiftly and decisively
Manages rapidly changing priorities with a calm attitude and professional manner
Shows dedication and commitment to ensure work is completed within timescales
Passionate about providing an exceptional BI experience to our users
A strong customer led work ethic and the ability to work well under pressure
Maintain complete confidentiality and discretion at all times
Seeks to understand and support change
Excellent English language skills, both written and verbal

BI Developer Technical Skills


Experience building BI solutions with the following BI technology platforms
Microsoft Azure Databricks (Delta Live Tables)
Microsoft Azure Synapse Analytics
Microsoft Azure Data Factory
Microsoft Azure Data Lake
Microsoft Azure Analysis Services / SQL Server Analysis Services
Microsoft SQL Server Database
Microsoft Power BI
Microsoft SQL Server Integration Services (SSIS)
Microsoft SQL Server Master Data Services (MDS) (~)
Microsoft SQL Server Reporting Services (SSRS) (~)
Experience using the following development tools (in a busy team (~))
Visual Studio Database Project
Visual Studio Code
Tabular Editor (~)
GIT Source Control (should be comfortable with safely resolving conflicts)
SQL Server Management Studio
Azure Data Studio (~)
Azure DevOps Boards/Pipelines (~)
Able to author code that is easy for other developers to understand and performs appropriately.
Languages
DAX
T-SQL
Markdown
PowerShell (~)
Python (~)
C# (~)
CSV (~)
JSON (~)
XML (~)
Regular Expressions (~)
Solid understanding of Kimball data warehouse design principles allowing confident explanation of
design decisions and rejected alternatives
Able to understand business processes and translate them into a Kimball model
Understanding of both manual and automated test practices to assure system integrity
Able to efficiently use data analysis tools and methods to understand and validate data according to
the question at hand
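The data-validation skill above touches several of the listed languages (Python, CSV, Regular Expressions). As a small, hypothetical sketch — the column names, checks, and `validate_rows` helper are invented for illustration only — validating a CSV extract before loading it into a warehouse staging table could look like:

```python
# Hypothetical sketch: basic row-level validation of a CSV extract.
import csv
import io
import re

DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # ISO yyyy-mm-dd shape

def validate_rows(csv_text):
    """Yield (line_number, error) tuples for rows failing basic checks."""
    reader = csv.DictReader(io.StringIO(csv_text))
    for lineno, row in enumerate(reader, start=2):  # header is line 1
        if not DATE_RE.match(row.get("order_date", "")):
            yield lineno, "bad order_date"
        try:
            if float(row.get("amount", "")) < 0:
                yield lineno, "negative amount"
        except ValueError:
            yield lineno, "non-numeric amount"
```

In a real pipeline these checks would typically live in the ETL tooling (e.g. Data Factory or Databricks) rather than a standalone script; the sketch only illustrates the validation mindset.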
2. DevSecOps
Mandatory Skills:

• DevSecOps, integrating security into CI/CD pipelines

• Automation of security controls and standards

• Familiar with integrating multiple security tools into CI/CD Pipelines (Gitlab preferred)

• Working experience implementing and testing automation scripts and setups

Role Description Skills:

Responsibilities

• Support secure application development practices and a secure development mentality

• Identifying, communicating, and providing targeted remediation of vulnerabilities

• Developing and updating security patterns aligned with security requirements

• Identifying application security requirements for projects

• Coordinating and collaborating with multiple teams to ensure the confidentiality, integrity, and
availability of Prudential assets in a way that meets business needs

• Performing other security-related projects that may be assigned according to skills


Skills


• Familiar with integrating security tools and providing vulnerability assessments, leveraging tools such
as Burp Suite Enterprise, Checkmarx, NowSecure, OWASP ZAP

• Understanding of OWASP Top 10 and SANS Top 25 vulnerabilities and how to remediate them

• Working knowledge of using APIs to interact with the web services provided by tools

• Conduct tool evaluations and build proofs of concept

• Integrate with reporting tools to provide a consolidated view

• Ability to turn technical standards into working practice

• Assist in driving consistency and standardization of DevSecOps services across the enterprise

• Strong automation skills; Python and Terraform preferred

• Maintain documentation and user guides

• Knowledge of security within cloud environments, especially around networking and
administration

• A motivated and flexible approach to work in a fast-moving, adapting Agile environment, utilizing
technology and tools such as Jira, Jira Align, Miro, Confluence

• Can demonstrate a strong performance ethos and personal commitment to outstanding customer
service

• Ability to interface with both technical and non-technical teams

• Willingness to train and upskill on a continuous basis

• Excellent communication, time management and organizational skills
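To illustrate the security-automation theme of this role (Python preferred, per the skills above), here is a minimal, hypothetical sketch of a secret-scanning gate of the kind often wired into a CI pipeline. The pattern names and `scan` helper are invented for the example and are not any specific tool's API:

```python
# Hypothetical sketch: detect likely secrets in text before a merge proceeds.
import re

PATTERNS = {
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "private_key": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan(text):
    """Return the names of secret patterns found in `text`; a CI wrapper
    would fail the job (exit non-zero) when this list is non-empty."""
    return sorted(name for name, pat in PATTERNS.items() if pat.search(text))
```

In practice teams lean on dedicated scanners integrated into the pipeline (the role mentions GitLab and tools like Checkmarx); the sketch only shows the shape of such a control.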
