Budget - USD 70/hour
Contract Duration - 3 to 6 months
Employment Type - Full time
Experience - 10 Years
Availability - Immediately
Timezone - (UTC-03:00) America/Argentina/Ushuaia
Work Mode - Onsite - Client location
Work Location - Irving, TX, USA
We are looking for a Data Architect who can lead end-to-end enterprise data architecture and modeling initiatives across cloud platforms, with a strong focus on Google Cloud Platform (GCP). The candidate will work closely with business, engineering, and QA teams to support the development of strategic data products and drive advanced analytics and ML initiatives.
Key Responsibilities
• Understand business domain data and its application in metrics, analytics, and AI/ML solutions
• Ensure data design integrity and governance across strategic projects and data products
• Design conceptual, logical, and physical data models with a strong focus on dimensional data design (see the target-schema sketch after this list)
• Apply data modeling techniques including data flow diagrams, ER diagrams, and metadata management
• Implement and manage data taxonomy and security classification/protection
• Build effective partnerships with business teams, Data Engineering, and other stakeholders
• Work with cross-functional teams to deliver data solutions in production environments
• Perform data mapping from Teradata, Oracle, SQL Server, and semi-structured sources to target systems
• Document complex business rules and maintain accurate source-to-target mappings
• Conduct data profiling and analysis with the ability to present pivoted results in Excel
• Maintain and update JIRA tickets regularly
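
To make the dimensional modeling and source-to-target expectations concrete, here is a minimal sketch of a BigQuery star-schema target; all dataset, table, and column names (sales_dw, dim_customer, fct_orders) are hypothetical illustrations, not names from this engagement:

    -- Hypothetical star-schema target in BigQuery Standard SQL.
    -- Source-to-target note: customer_id would map from an operational
    -- source (e.g., Teradata/Oracle/SQL Server); customer_key is a
    -- warehouse-generated surrogate key.
    CREATE TABLE sales_dw.dim_customer (
      customer_key   INT64 NOT NULL,  -- surrogate key
      customer_id    STRING,          -- natural key from the source system
      customer_name  STRING,
      segment        STRING,
      effective_from DATE,            -- SCD Type 2 validity window
      effective_to   DATE
    );

    CREATE TABLE sales_dw.fct_orders (
      order_date   DATE NOT NULL,
      customer_key INT64,             -- joins to dim_customer
      product_key  INT64,
      quantity     INT64,
      order_amount NUMERIC
    )
    PARTITION BY order_date           -- date partitioning for pruning and cost control
    CLUSTER BY customer_key, product_key;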
Required Skills
• Strong expertise in data architecture and design using Google Cloud BigQuery (mandatory)
• Hands-on experience in GCP data design and migration
• GCP Certification or equivalent practical experience
• Deep knowledge of Google Cloud Services including:
o Streaming/Batch processing
o Cloud Storage
o Dataflow
o Dataproc
o Cloud Functions
o BigQuery
o Bigtable
• Java-based development experience with Dataproc & Dataflow on GCP
• Expertise in serverless data warehousing concepts on GCP
• Experience handling structured and unstructured data sources
• Familiarity with technologies like Kafka, StreamSets, Collibra, MapReduce, Hadoop, Spark, Flume, Hive, Impala, and Spark SQL
• Proficiency in using data modeling tools such as Erwin Data Modeler (or equivalent) to create architecture diagrams and documentation
• Working knowledge of Teradata, AWS, and GCP data ecosystems
• Advanced SQL skills (a sample profiling query follows this list)
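
As a rough illustration of the data-profiling and advanced SQL expectations, a minimal BigQuery profiling query against the hypothetical fct_orders table sketched earlier might look like:

    -- Minimal column-profiling sketch in BigQuery Standard SQL.
    -- Table and column names are the hypothetical ones from the earlier sketch.
    SELECT
      COUNT(*)                       AS row_count,
      COUNTIF(customer_key IS NULL)  AS null_customer_keys,
      COUNT(DISTINCT customer_key)   AS distinct_customers,
      MIN(order_date)                AS min_order_date,
      MAX(order_date)                AS max_order_date,
      ROUND(AVG(order_amount), 2)    AS avg_order_amount
    FROM sales_dw.fct_orders;

Results like these pivot cleanly into Excel for the profiling summaries mentioned under Key Responsibilities.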