CURRENT OPENINGS

Data Architect

Data Architect with a Bachelor’s degree in Computer Science, Computer Information Systems, Information Technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor’s degree in one of these fields.

Job Duties and Responsibilities:

  • Define and execute a global analytics and data science technical strategy, architecture, and roadmap to deliver on tactical and strategic initiatives, partnering with business and IT leaders to turn data insights into actionable outcomes.
  • Build and manage data on-boarding, data platform operations, analytics tools, and visualization technologies. Set the strategy and build the team.
  • Build an internal brand and develop a diverse team of global data engineers, product managers, and data scientists to promote a federated, data-focused culture across the company. Coach and mentor team members, including dotted-line reports as necessary. Align team members in roles that best leverage their strengths and grow/diversify their skill sets.
  • Collaborate extensively with the business to drive incremental value and innovation through the adoption of analytics and data science techniques. Identify, prioritize, and socialize strategies and roadmaps for actionable outcomes.
  • Mature and innovate the current AWS data lake platform and scale it to support the most complex patient-care use cases.
  • Enable the organization to focus on continuous operational excellence, analytics architecture evolution, and high-value technology selection decisions, with transparency to business leaders.
  • Chair the monthly analytics steering committee and executive stakeholder engagement to ensure alignment on requirements and delivery expectations.
  • Maintain a strong understanding of the analytics development lifecycle, from requirements through logical and physical data model design, implementation, testing, deployment, and support.
  • Drive data literacy and the art of the possible, and develop support for investments as part of the AOP planning cycle.
  • Partner with the Architecture and Data Governance teams on solution analysis, business requirements, cost-benefit analysis, risk assessment, and business impact analysis.
  • Represent the organization as a thought leader on enterprise analytics and data intelligence with strategic engagement partners and industry forums.
  • Develop and maintain program, project and operational reporting to demonstrate quality execution of project commitments (budget, benefits, etc.) and adherence to service level commitments.
  • Provide hands-on management of cloud-hosted and SaaS vendors. Ensure all activity is delivered in compliance with standard policies, procedures, technology standards, security requirements, etc.
  • Develop and integrate metrics into support, enhancements, and projects that clearly demonstrate the value of IT to the business.

Work experience required for the position:

  • Hands-on experience in “Big Data” technologies such as Hadoop, Hive, Kafka and Spark, as well as a general understanding of the fundamentals of distributed data processing (data lakes, ETL/data warehousing, DB design).
  • Extensive data analysis experience with Agile and Waterfall methodologies, Scrum project management, program management, master data management, and product ownership, spanning all Software Development Life Cycle phases from requirements gathering, analysis, design, development, implementation, and integration through data migration from legacy systems to NoSQL and RDBMS applications.
  • Extensive experience performing various roles, including Data Architect, Data Modeler, Developer, ETL Modeler, Data Analyst, Agile Project Manager, Product Manager, Program Manager, Technical Team Lead, and ETL Architect/Analyst. Must have managed multiple projects involving OLAP, OLTP, ODS, EDW, MDM, Data Lake, Data Vault, Data Governance, Data Profiling, Data Cleansing, and defining and designing Anchor Modeling and Focal Point data models.
  • Experience coordinating with RDBMS, Hadoop, Web Services, AWS (S3, EC2, DynamoDB), SOA, REST API, SAP HANA, and ETL development teams, and implementing Single Sign-On architecture.
  • Expert in Designing and Implementing Star, Snowflake, and Galaxy Schema Models.
  • Worked extensively with Teradata, DB2, Oracle, MySQL, SQL Server, Sybase, AWS Redshift, S3, EC2, Snowflake, PostgreSQL, MS Azure, Cosmos DB, and Cassandra.
  • Extensively involved with application development, middleware, servers, storage analysis, database management, and technical and functional operations of the business at the enterprise level.
  • Established data extraction methodology for Oracle, SQL Server, and MySQL to migrate data to Big Data/Hadoop platforms and Snowflake Cloud DB, including Time Travel and Zero-Copy Cloning.
  • Hands-on experience and strong proficiency with Scala, Python, or Java.
  • Cloud technologies experience/knowledge is a plus (GCP, AWS, Azure).

Technologies / Environment involved:

  • Distributed Storage: Hadoop HDFS, Azure HDInsight
  • Database Management: Apache Cassandra, Apache Hive, Apache HBase
  • Graph Processing: JanusGraph, distributed graph databases, Apache Spark
  • Machine Learning: Spark Machine Learning Library (MLlib), TensorFlow, Keras
  • Data Ingestion and Processing: Apache NiFi, Apache Kafka, Apache Spark, Hadoop MapReduce
  • Programming Languages: Java, Scala, Python
  • DevOps Tools: Bitbucket, Apache Maven, Selenium, Jenkins

Work location is Portland, ME, with required travel to client locations throughout the USA.

Rite Pros is an equal opportunity employer (EOE).

Please Mail Resumes to:
Rite Pros, Inc.
565 Congress St, Suite # 305
Portland, ME 04101.

Email: resumes@ritepros.com