CLOUD DATA ARCHITECT
Seeking an experienced Data Architect with a background in cloud technologies, specifically AWS, and in building data lakes for an enterprise-level organization. The ideal candidate must also be hands-on and will be responsible for both the architecture and the Infrastructure as Code (IaC). The Cloud Data Architect will be an integral part of a small architecture team and will drive the firm-wide strategic evolution of the organization's cloud data architecture, focusing on cloud solutions and implementations. You will act as the Cloud Data SME, and your efforts will advance the firm's ability to deliver cost- and performance-optimized solution architectures for data lakes, analytics, and machine learning, ensuring our technology solutions safeguard the security, privacy, and integrity of the firm. Excellent communication skills are required, as you will collaborate with the Data Science, Engineering, and Product teams to anticipate future challenges and to provide thought leadership and guidance on the data implications of their solutions.
Lead architecture solutioning on data related projects and programs by creating AWS strategies and solution architectures to accomplish objectives.
Collaborate with stakeholders to define a comprehensive "Data Platform" that covers strategic use cases and enables self-service consumption of the platform.
Develop the Data Lake and overall Analytics architectural strategy, including Reference Architectures and Reference Implementations to establish and grow the platform.
Partner with Data Scientists, Analysts, Architects, and Engineers to produce Solution Architectures that leverage the Data Platform in an optimized manner with a low barrier to entry.
Ensure orderly introduction of an increasing pipeline of relevant data to the Data Lake.
Build and manage technical roadmap to achieve strategic objectives on time and within budget.
Develop and maintain a data governance framework based on best practices and industry standards.
Identify and implement processes, tools, and methods that support the target strategy, address the data lifecycle, and cover data movement, data security, data privacy, and metadata management.
Implement solutions using Infrastructure as Code to ensure repeatable, consistent, and deterministic outcomes.
Develop fully automated solutions using AWS Cloud Native Services that enable performance and cost optimized solutions.
Architect solutions for cloud integration of custom and enterprise applications.
Identify and adopt best practices, architectural solution patterns, new technologies to address current and future needs, and cloud based cost management strategies.
Embrace DevOps practices / CALMS principles.
Communicate architectural information to non-technical audiences.
BA/BS degree in Computer Science, a related Software Engineering discipline, or equivalent experience.
4+ years of Software Development experience, including 2+ years of Python experience.
4+ years of Solution Architecture experience, including 2+ years of Data Architecture experience.
3+ years hands-on AWS experience, including 2+ years hands-on AWS experience in CodePipeline and/or CloudFormation
3+ years of hands-on experience with relational database platforms, and 1+ years of hands-on experience with non-relational database platforms.
2+ years of Solution and/or Enterprise Architecture experience, including ETL, Data Quality, Security, and Governance practices.
1+ years of architecture experience migrating both custom-built and enterprise applications to AWS.
Self-starter with a high degree of autonomy; demonstrated ability to self-teach and apply new learnings as a regular course of activity.
Experience and expertise with data visualization and data modeling tools.
Experience building and operating data lake and data warehouse solutions, preferably on AWS.
A well-grounded knowledge of engineering and continuous delivery practices using modern software development tooling (GitHub, CodeCommit, IntelliJ IDEA, PyCharm, Visual Studio, etc.), processes (e.g. Scrum, Agile), and toolsets (e.g. JIRA, Confluence).
AWS certification, preferably Solutions Architect and/or DevOps Engineer.
Ability to learn new tech quickly and effectively.
Hands-on experience with any of the following AWS services: SageMaker, Batch, Athena, Glue ETL/Catalog, QuickSight, RedShift
Experience with Data Science and Analytics programming languages (e.g. Python, R).
Knowledge of big data technologies and frameworks (e.g. Hadoop, Spark, Hive, Kafka).
Familiarity with Microsoft Power BI migrations.
Understanding of IAM Security, Policies, and Roles
Experience and/or demonstrated interest in AWS Lambda, CLI, EC2, RDS, Route53, DynamoDB, SQS, SNS
Active in technical communities (e.g. AWS Meetups) and in publishing technical content (e.g. SlideShare, Medium).