Database Architect


Job description
USC/GC only, due to a legal or government contract requirement.
W2 or own corporation.
The core skill set required includes Snowflake, Informatica for ETL, Erwin for data modeling, and Oracle Database. Candidates must have enterprise experience performing third-party integrations.
Skills (at least 5-7 years in each, unless noted):
• Knowledge of SQL and cloud-based technologies (SQL: 10 years)
• Data warehousing concepts, data modeling, metadata management
• Data lakes, multi-dimensional models, data dictionaries
• Performance tuning and setting up resource monitors
• Snowflake modeling – roles, databases, schemas
• SQL performance measuring, query tuning, and database tuning
• Experience with cloud-based ETL tools
• Ability to build analytical solutions and models
• Root-cause analysis of data models, with recommended solutions
• Hadoop, Spark, and other warehousing tools
• Managing sets of XML, JSON, and CSV from disparate sources
• SQL-based databases such as Oracle, SQL Server, and Teradata
• Snowflake warehousing, architecture, processing, administration
• Data ingestion into Snowflake
• Enterprise-level technical exposure to Snowflake applications
Responsibilities:
• Create, test, and implement enterprise-level apps with Snowflake
• Design and implement features for identity and access management
• Create authorization frameworks for better access control
• Implement query optimizations and core security measures, including encryption
• Resolve performance and scalability issues in the system
• Transaction management with distributed data processing algorithms
• Take ownership of deliverables from start to finish
• Build, monitor, and optimize ETL and ELT processes with data models
• Migrate solutions from on-premises setup to cloud-based platforms
• Understand and implement the latest delivery approaches based on data architecture
• Document and track projects based on user requirements
• Perform data integration with third-party tools including architecting, designing, coding, and testing phases
• Manage documentation of data models, architecture, and maintenance processes
• Continually review and audit data models for enhancement
• Maintain robust data pipelines built on ETL tools
• Coordination with BI experts and analysts for customized data models and integration
• Code updates, new code development, and reverse engineering
• Performance tuning, user acceptance testing, and application support
• Maintain confidentiality of data
• Risk assessment, management, and mitigation plans
• Regular engagement with teams for status reporting and routine activities
• Migration activities from one database to another or on-premises to cloud
Location: Baltimore, MD 21212
Expiry: June 26, 2023