Informatica Data Quality (IDQ) Developer

Informatica Data Quality (IDQ) Developer with a Bachelor's Degree in Computer Science, Computer Information Systems, or Information Technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor's degree in one of the aforementioned subjects.

Job Duties and Responsibilities:

  • Analyze data from different applications (Salesforce Sales & Marketing Cloud, Oracle HR, CANSPAM, RCS, DNR, FFS, FAS, iRACDB2, GDW, SFMC, Data Platform, RECVUE, CARISMA) using Informatica Analyst.
  • Perform data profiling and build scorecards; identify data gaps; interact with Data Stewards and Business SMEs; apply business and data transformation rules.
  • Perform data cleansing, data validation, data standardization, and customer address validation using Address Doctor; migrate the cleansed data to Salesforce (Sales & Marketing Cloud), AWS S3 buckets, Data Platform Hive DB, MuleSoft, and downstream applications using Informatica PowerCenter on AWS and Informatica Intelligent Cloud Services (IICS).
  • Prepare ETL technical specifications, identify gaps, and have them reviewed by Data Architects and Business SMEs for proper sign-off and developer handover.
  • Use the Informatica PowerExchange connector to read DB2 mainframe sales system data and load it into Salesforce (Sales & Marketing Cloud) using the Salesforce connector.
  • Perform bulk data loads and deletions on Salesforce objects using automated Informatica ETL jobs.
  • Understand and enhance GDW Teradata BTEQ scripts based on new requirements; provide required Salesforce object names and their data types in the change list of the Salesforce Agile Accelerator, in coordination with the Salesforce team.
  • Manage all third-party source connectivity via the MFT (Cleo) mount point; automate file pickup and drop from/to the MFT server and write data to different AWS S3 buckets and the Data Platform Hive DB.
  • Develop ETL code; conduct unit testing and code peer review with Data Architects; hand over to the QA team and deploy code across environments. Prepare the technical design document and provide post-production and warranty support.
  • Analyze defects, identify the root cause, apply code fixes, test and redeploy, and properly close defects in HP ALM.

Skills / Knowledge required:

  • Expert designer, coder, and tester for data warehouse programming using Informatica PowerCenter, Informatica Developer Tool, and Informatica PowerExchange (9.5.1 and above).
  • Experience in data profiling, data cleansing and standardization, and building reusable logic.
  • Able to read/write data from/to different databases and applications such as Teradata, SQL Server, DB2, Oracle, Salesforce, AWS S3, and DP Hive.
  • Expert in writing complex SQL and PL/SQL blocks for Sybase, Oracle, MSSQL, and UDB/DB2.
  • Experience in design reviews and extensive documentation of standards, best practices, and ETL procedures.
  • Evaluate all functional requirements and mapping documents, and troubleshoot all development processes.
  • Ability to provide workarounds and fix technical bugs in existing processes.
  • Support code migration and deployment for all data migration tasks.
  • Experience implementing data movement best practices and strategies to build an optimized, high-performing solution.
  • Strong experience coordinating with Business Analysts to understand business and functional requirements and to convert business rules into technical specifications.
  • Proven ability to work independently or in conjunction with a team.
  • Incorporate process changes and updates into the Standard Operating Procedures.

Work location is Portland, ME with required travel to client locations throughout USA.
Rite Pros is an equal opportunity employer (EOE).

Please Mail Resumes to:
Rite Pros, Inc.
415 Congress St, Suite # 201 & 202
Portland, ME 04101