Informatica jobs
...privacy-preserving transformations so that multiple parties can collaborate on sensitive datasets without risk. What I expect from you is a focused, hands-on transfer of know-how rather than a generic primer. Walk me through real-world patterns, preferred storage layers, and the way you wire up connectors and workflow engines. If you have battle-tested approaches using platforms such as Apache NiFi, Talend, Informatica—or any equivalent stack—please weave those in, but feel free to recommend superior alternatives if they serve the clean-room model better. Deliverables • 3–4 live or recorded sessions (screen-share) illustrating architecture, pipeline setup, and governance checkpoints • A concise reference document summarising the clean-room componen...
Project Title: Data Integration & Visualization Specialist
Project Description: We need a person to work on ETL development (Informatica IDMC).
Work Type: Full-time or part-time; 3–5 days per week; around 4 hours per day; remote work.
Required Skills: Informatica PowerCenter & IDMC, ETL development, SQL.
Payment: Based on work delivered, with higher pay for strong performance.
Project Title: Data Integration & Visualization Specialist for ETL, IDMC, Informatica & Qlik
Project Description: We are seeking an experienced Data Integration & Visualization Specialist for a project involving ETL development, Informatica, IDMC, and Qlik. The goal of the project is to design, implement, and maintain data pipelines and dashboards for seamless data processing and reporting.
Responsibilities:
- Design and develop ETL pipelines for data extraction, transformation, and loading.
- Work with Informatica PowerCenter and IDMC to manage and optimize data workflows.
- Develop and maintain Qlik dashboards and reports for data visualization.
- Ensure data quality, accuracy, and consistency across systems.
- Collaborate with project stakeholders to meet r...
...upgrades, or enhancements. Ensure high levels of system availability, performance, and security. Required Skills & Experience Technical Expertise Deep knowledge of Salesforce platform (Sales, Service, Marketing, Experience Clouds) and development. Strong understanding of Apex, Lightning Web Components, SOQL/SOSL, and Salesforce APIs. Experience with integration platforms (MuleSoft, Informatica, Dell Boomi, etc.). Architecture & Design Proven track record in designing large-scale, complex Salesforce solutions. Familiarity with enterprise architecture principles and frameworks. Knowledge of data modeling, security architecture, and performance optimization. Leadership & Communication Strong ability to engage with executives, business users, and technical teams. Excell...
... Help design, build, and optimize ETL pipelines and data workflows. Troubleshoot and resolve performance or integration issues in real-time. Guide candidates through daily deliverables and project requirements. Ensure quality, accuracy, and timely completion of all assigned tasks. Technical Expertise Required: Strong proficiency in: Python, SQL, and Data Modeling. ETL Tools: Airflow, Informatica, AWS Glue, Azure Data Factory, or equivalent. Big Data Technologies: Spark, Hadoop, Hive. Cloud Platforms: AWS, Azure, or GCP (preferably Redshift, Snowflake, Databricks). Additional Skills: CI/CD pipelines, Git, data governance, performance optimization. Qualifications: Minimum 10+ years of experience in Data Engineering, ETL Development, or Data Pipeline Architecture. Stron...
...data pipelines, ETL processes, and data modeling. Optimize and maintain large-scale data systems and workflows. Support in performance tuning, debugging, and data migration activities. Offer guidance on data architecture, best practices, and real-time project execution. Technical Skills Required (any of the following): Programming: Python, SQL, PySpark ETL Tools: Apache Airflow, Talend, Informatica, or similar Cloud Platforms: AWS (Glue, Redshift, S3), Azure (Data Factory, Synapse), or GCP (BigQuery) Databases: PostgreSQL, Snowflake, MySQL, MongoDB Big Data Technologies: Hadoop, Spark, Databricks (preferred) Version Control / CI-CD: Git, Jenkins Ideal Candidate: Has 10+ years of experience in data engineering and related technologies. Strong in troubleshooting, archi...
Work Description: Review Informatica mapping/design documents. Understand the business logic (source → transformation → target). Convert logic into clean, optimized Databricks SQL or PySpark code. Help debug existing notebooks where logic or performance issues occur. All work is performed directly inside Databricks (no external tools). Requirements: Strong hands-on experience with Databricks (SQL & PySpark). Familiarity with Informatica ETL concepts (mappings, expressions, joins, lookups, etc.). Ability to write readable, well-commented code. Available for short-term or on-demand screen-sharing support.
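The conversion work described above follows a repeatable pattern: each Informatica mapping (Source Qualifier, then Expression, then Lookup, then Target) collapses into one SQL statement or PySpark chain. Below is a minimal sketch of that translation. All table and column names are invented for illustration, and sqlite3 stands in locally for Databricks SQL, which accepts the same shape of query.

```python
import sqlite3

# Hypothetical tables standing in for the Informatica source and lookup;
# in Databricks the same SQL would run against Delta tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, qty INTEGER, unit_price REAL);
CREATE TABLE customers (cust_id INTEGER, cust_name TEXT);
INSERT INTO orders VALUES (1, 10, 2, 5.0), (2, 11, 1, 9.5), (3, 99, 3, 1.0);
INSERT INTO customers VALUES (10, 'Acme'), (11, 'Globex');
""")

# Informatica pattern: Source Qualifier -> Expression (total = qty * unit_price)
# -> connected Lookup on customers (left outer, with a default for no match)
# -> Target. Expressed as one Databricks-style SQL statement:
sql = """
SELECT o.order_id,
       COALESCE(c.cust_name, 'UNKNOWN') AS cust_name,  -- lookup default value
       o.qty * o.unit_price             AS total       -- expression transform
FROM orders o
LEFT JOIN customers c ON c.cust_id = o.cust_id
ORDER BY o.order_id
"""
rows = con.execute(sql).fetchall()
print(rows)
# [(1, 'Acme', 10.0), (2, 'Globex', 9.5), (3, 'UNKNOWN', 3.0)]
```

The same logic in PySpark would be a `join(..., "left")` followed by `withColumn` and `coalesce`; which form to use usually comes down to the team's review preferences.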
Hello, We are looking for an experienced trainer to deliver a short-term training project on "Informatica Master Data Management (MDM)". Responsibilities: - Conduct focused training sessions on Informatica Master Data Management (MDM) - Create or adapt training material as required - Provide hands-on lab guidance (if applicable) Requirements: - Proven experience in Informatica Master Data Management (MDM) - Prior corporate training experience preferred - Ability to deliver training effectively within a short-term timeline To Apply, Please Share: - Updated CV / Profile - Course contents (TOC) - Daily / Hourly commercial rates - Lab availability & charges (if applicable) - Your availability schedule Looking forward to collaborating with the right exper...
I have an operational Informatica 360 dashboard and now need to extend it with new functionality—specifically, a suite of custom reports centered on product data. The core focus is to let business users slice, filter, and export product-level metrics without leaving the 360 environment. Here’s what I already have in place: • A functioning Informatica 360 installation (on-prem) with standard reporting widgets. • A well-defined product data model that’s fully mastered and up to date. What I need next: • Design and build of tailored report templates that surface key product attributes, performance indicators, and any related lookup data. • Configuration inside Informatica 360 so the new reports render seamlessly in the existing ...
...maintain ETL pipelines using Informatica PowerCenter and Azure Data Factory; Transform and migrate data from source to target systems with a focus on data quality, consistency, and added value; Work closely with analysts, solution designers, testers, and developers to deliver end-to-end migration solutions; Provide advice on migration strategy, technical solutions, and data mapping challenges; Continuously develop your skills and share your knowledge within the team and broader organization. Skills We are looking for someone who combines technical expertise with strong collaboration and communication skills. Your profile includes: Bachelor’s degree in IT or related field; Minimum 2 years of experience in data engineering or ETL development; Proven experience with Infor...
...executing one-off and incremental loads that bring historical data into Databricks with full fidelity and clear audit trails. • Pipeline creation – building robust, reusable workflows that land, transform, and publish refreshed data sets on a schedule we define. Although Databricks is the primary platform, several feeds currently live in AWS (Glue catalog, S3) and a few legacy mappings sit in Informatica. Your solution must therefore stitch together these environments smoothly, handling the transformation logic that bridges AWS and Databricks. Acceptance criteria • All source tables fully migrated and validated in Databricks (row counts, checksums, key sampling). • Jobs parameterised and orchestrated so new data arrives without manual intervention. ...
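The validation step in the acceptance criteria (row counts, checksums) can be scripted so it runs identically on both sides of the migration. A minimal sketch, with hypothetical table names and sqlite3 standing in for the source and Databricks copies:

```python
import hashlib
import sqlite3

def table_fingerprint(con, table):
    """Row count plus an order-independent checksum (XOR of per-row MD5s).
    Run the same function against the source system and the Databricks copy,
    then compare the two fingerprints."""
    count, acc = 0, 0
    for row in con.execute(f"SELECT * FROM {table}"):
        digest = hashlib.md5(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")  # XOR makes row order irrelevant
        count += 1
    return count, acc

# Hypothetical source and migrated copies of the same table.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for con in (src, tgt):
    con.execute("CREATE TABLE t (id INTEGER, v TEXT)")
src.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])
tgt.executemany("INSERT INTO t VALUES (?, ?)", [(2, "b"), (1, "a")])  # different load order

assert table_fingerprint(src, "t") == table_fingerprint(tgt, "t")
print("row counts and checksums match")
```

Key sampling, the third criterion, would add a deterministic sample (for example, `WHERE id % 1000 = 0`) compared value by value.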
Hello, I am preparing for the Informatica Cloud Data Quality (CDQ) certification exam. I am looking for someone who can provide me with helpful practice questions, exam dumps, or collections that will support my preparation.
We are looking for a freelance Data Engineer to collaborate on a project in the insurance sector, working 100% remotely. Mandatory Requirements Advanced English (minimum B2 high, ideally C1). 5+ years of experience in data engin...especially Apache Spark. Solid understanding of data modeling (especially Data Vault 2.0) and data warehousing principles. Experience working with Software as a Service (SaaS) models. Ability to plan and manage cloud service components, ensuring proactive management and scalability. Familiarity with DevOps processes. Nice to Have (Plus) Experience with data governance frameworks, such as Informatica. Collaboration Details Industry: Insurance Engagement: Freelance Location: 100% remote Start date: Immediate, joining high-impact projects in t...
...Data Engineer to design, develop, and optimize ETL services and large-scale data integration workflows. The role involves working with diverse data sources including relational databases, APIs, cloud data warehouses, and flat files. Requirements: 5+ years of hands-on experience in ETL development, data engineering, and pipeline automation. Proficiency in Snowflake, SQL, Python, DBT, Airflow, Informatica, Talend. Expertise in OpenSearch/Elasticsearch for indexing, querying, cluster management, and optimization of large datasets. Strong background in data modeling, governance, quality assurance, and compliance. Proven ability to integrate and synchronize data across systems like Salesforce, Segment, Tealium, Quickbase, AiMatch, Ongage, CDPs, DSPs. Strong analytical and problem-sol...
This position is for a senior role, responsible for designing and developing sound solutions in the MDM space, catering to simple and complex realizations of different types of master data using Multidomain MDM, Business 360, Customer 360, Reference 360, Data Integration, and other Informatica cloud offerings. POSITION ACCOUNTABILITIES: • Play a pivotal role in the maintenance, support, and upgrade of secure agents and the deployed services, including data quality, governance, and management of master data. • Help data stewards and business users with their day-to-day needs when accessing the tool. Liaise with data stewards/business users to ensure that data standards, business rules, workflows and supporting transactions are aligned with the laid-out MDM standards &...
...Supporting cloud migration initiatives, particularly on Azure Cloud Services. Qualifications 8+ years of Software Engineering experience. Bachelor’s degree in Engineering/Computer Science (or equivalent). Strong background in Java development (essential). Hands-on experience in Azure Cloud Services (especially cloud migration). Technical Skills Proficiency with data manipulation tools: SSIS, Informatica. Strong command of development languages: Python, Java, C/C++, HTML, XML, SQL, JSON. Familiarity with Windows/UNIX environments. Experience with Git (GitHub/GitLab). Knowledge of test-driven development and industry best practices. Hands-on experience with Azure/AWS cloud platforms. Ability to adapt to Agile/Waterfall development methodologies. Eagerness to lea...
...Stay updated with emerging technologies and apply best practices. **Qualifications** * **8+ years** of Software Engineering experience. * Bachelor’s degree in **Engineering / Computer Science** or equivalent. * Strong hands-on experience in **Java development**. * **ECL knowledge – Mandatory.** * Hands-on expertise in **Azure** (cloud services, migration projects). * Proficiency with **SSIS, Informatica** for data manipulation. * Strong knowledge of **Python, Java, C/C++, SQL, HTML, XML, JSON**. * Experience in **Windows & UNIX environments**. * Knowledge of **Git (GitHub/GitLab)**, version control, and CI/CD. * Familiarity with **Test-Driven Development (TDD)** and best coding practices. * Strong analytical and problem-solving skills. **Preferred Skills** * E...
I need an IDQ expert for a hands-on, live session on Google Meet. Key topics to cover: - Data profiling - Standardization and normalization - Duplicate elimination - Parsing & Duplicate Detection (Matching) - Data Cleansing Techniques - Consolidation - Exception Handling - Best Practices - Real-life scenarios Ideal Skills: - Proficiency in Informatica Data Quality - Experience with data cleansing processes - Ability to explain complex concepts clearly - Familiarity with flat file data sources (CSV, Excel) - Strong communication skills This session must be recorded
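The standardization and duplicate-elimination topics above can be previewed outside IDQ. The following is a toy sketch, not IDQ itself: the abbreviation list and sample records are invented, and real IDQ matching adds fuzzy strategies (edit distance, Soundex, weighted scores) on top of exact matching.

```python
import re

def standardize(name):
    """Toy standardization: trim, collapse whitespace, uppercase, and expand
    a few address-style abbreviations (illustrative list only)."""
    name = re.sub(r"\s+", " ", name.strip()).upper()
    abbrev = {"ST.": "STREET", "ST": "STREET", "RD": "ROAD"}
    return " ".join(abbrev.get(tok, tok) for tok in name.split())

records = ["12 Main St", "12  main street ", "34 Oak Rd", "34 Oak Road"]

# Exact-match duplicate detection on the standardized form: records that
# collapse to the same key are candidates for consolidation.
groups = {}
for rec in records:
    groups.setdefault(standardize(rec), []).append(rec)
dupes = [v for v in groups.values() if len(v) > 1]
print(dupes)
# [['12 Main St', '12  main street '], ['34 Oak Rd', '34 Oak Road']]
```

In an IDQ session the same flow would be built with Parser, Standardizer, Match, and Consolidation transformations over the flat-file sources mentioned above.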
I am an experienced professional with 9 years of expertise in ETL and ELT tools, including Informatica, IICS, Talend, Alteryx, DBT, and Snowflake. I'm eager to collaborate with skilled freelancers on projects involving Snowflake and DBT. My goal is to assist you in saving time and reducing costs while expanding my knowledge of complex Snowflake and DBT concepts. Key Areas of Collaboration: - Snowflake data migration from relational databases - Writing custom transformations in DBT Ideal Skills and Experience: - Proficiency in Snowflake and DBT - Experience with data migration from relational databases - Ability to write and optimize custom transformations in DBT - Strong problem-solving skills and a collaborative mindset If you need assistance in these areas and are open to ...
...I am seeking a skilled professional with experience in SQL Server development and Informatica to create a robust Informatica pipeline. The pipeline will need to perform complex data transformations, including aggregation, normalization, and data enrichment, and will interact with flat file data sources. Key Requirements: - Develop an Informatica pipeline capable of handling complex transformations - Implement data aggregation, normalization, and enrichment processes - Integrate and manage flat file data sources effectively - Ensure data accuracy and integrity throughout the transformation process Ideal Skills and Experience: - Proven experience with SQL Server development - Strong proficiency in Informatica for data transformation tasks - Familiarity with h...
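The three transformation types named above (aggregation, normalization, enrichment) over a flat-file source can be illustrated compactly. This is a stand-in sketch with invented data, not Informatica code: in the actual pipeline these steps would be a flat-file source feeding Expression, Aggregator, and Lookup transformations.

```python
import csv
import io
from collections import defaultdict

# Hypothetical flat-file extract (note inconsistent casing and whitespace).
raw = """region,product,amount
north,widget, 10
NORTH,widget,5
south,gadget,7
"""

# Enrichment lookup, standing in for a reference table.
region_manager = {"NORTH": "Alice", "SOUTH": "Bob"}

totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(raw)):
    region = row["region"].strip().upper()                    # normalization
    totals[(region, row["product"])] += float(row["amount"])  # aggregation

enriched = [                                                  # enrichment
    {"region": r, "product": p, "total": t, "manager": region_manager.get(r, "N/A")}
    for (r, p), t in sorted(totals.items())
]
print(enriched)
```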
...(ERwin acceptable, preference for ER Studio) Experience in fact/dimension modeling, star/snowflake schemas Strong SQL skills and understanding of metadata, lineage, governance Excellent communication and analytical skills Active LinkedIn profile required Nice-to-Have: Experience with ER Studio Macros, ETL/BI tools, or cloud platforms Familiarity with data governance tools (e.g., Collibra, Informatica)...
I'm looking for an experienced professional to assist with a data migration project using Informatica PowerCenter, IICS, and SQL. The primary goal is to efficiently migrate data from cloud storage platforms. Key Requirements: - Expertise in Informatica PowerCenter and IICS - Strong SQL skills for data manipulation and migration - Experience with cloud storage data migration - Ability to handle large datasets and ensure data integrity Ideal Skills and Experience: - Proven track record in data migration projects - Familiarity with various cloud storage platforms (e.g., Amazon S3, Google Cloud Storage, Azure Blob Storage) - Strong analytical and problem-solving skills - Excellent communication and documentation abilities If you have the necessary skills and experience, I...
I'm seeking an experienced Informatica developer with at least three years of experience to assist with a project focused on data integration, migration, ETL processes, and data warehousing. This project is in its early stages, and the selected freelancer must be physically present in either Bangalore or Tirupati to participate in real-time meetings and provide daily updates. Key Responsibilities: - Assist with data integration and migration tasks - Support ETL process development and execution - Contribute to data warehousing activities - Participate in real-time meetings - Provide daily updates Ideal Skills and Experience: - Proficiency in Informatica Powercenter ETL, Informatica MDM, and Informatica Cloud Services - Strong background in data integratio...
Kindly apply only if you have relevant experience...tools. ● Strong command of SQL for data querying, transformation, and performance tuning. ● Solid understanding of data modeling concepts (e.g., star schema, snowflake schema). Technical Skills: ● Advanced SQL skills including stored procedures and performance optimization. ● Hands-on experience with BI tools such as Tableau, Power BI, and Snowflake. ● Familiarity with ETL tools like SSIS, Informatica, or Azure Data Factory. ● Understanding of data warehousing principles, methodologies, and architecture. Soft Skills: ● Excellent analytical and problem-solving capabilities with keen attention to detail. ● Strong communication and interpersonal skills—able to bridge the gap between technical teams and business stakeholders.
I need help migrating from Informatica to AWS Glue. The primary goal is performance improvement. Currently, we are using cloud databases as our data sources, and we don't require any data transformation during the migration; we just need to move the data. Ideal Skills and Experience: - Expertise in both Informatica and AWS Glue - Experience with cloud databases - Strong understanding of data migration processes - AWS certification is a plus
I'm seeking an experienced Informatica PIM (Product 360) professional to enhance user experience. Key Objectives: - The primary goal is to enhance user experience. - Ideal candidates should have strong knowledge and experience in Informatica Master Data Management (MDM) and Product 360. - Proficiency in product data management, ensuring data accuracy, and improving system usability is essential. Skills and Experience: - Expertise in Informatica PIM, MDM, and Product 360. - Previous experience in similar projects. - Ability to understand and implement user experience improvements. Looking forward to your expertise to streamline and enhance our product information management system.
I need an experienced Informatica IDMC developer to handle data integration, quality, and governance tasks. The ideal candidate will work with both databases and cloud storage, focusing on batch processing. Key Responsibilities: - Integrate and manage data from various databases and cloud storage. - Ensure data quality and implement data governance practices. - Handle batch processing for data integration tasks. Ideal Skills and Experience: - Proficiency in Informatica IDMC. - Strong experience with database and cloud storage integration. - Knowledge of data quality and governance principles. - Experience with batch processing. Looking for a candidate who can deliver robust data solutions efficiently.
...Databricks, Azure, and Mule. - Experience with both CRM and ERP systems. - Excellent leadership and strategic planning skills. • More than 10 years in IT with a strong focus on integration. • Integration Experience: At least 5 years in integration architecture and leadership roles. • ETL/ELT Tools: Extensive experience with ETL or ELT tools such as Azure Data Factory (ADF), Synapse Pipelines, Informatica, or similar. • App Integration: Proven experience in application integration with real-time data. • CRM & ERP Systems: Hands-on experience with CRM and ERP systems, preferably JDE and Dynamics 365. • Databricks, Azure Synapse Analytics, Delta Lake: Mandatory experience with Databricks, Azure Synapse Analytics, and Delta Lake. ...
I need assistance with a project involving an Informatica developer with three years of experience. This project, which is in its early stage of development, entails data integration, data migration, ETL processes, and data warehousing. The selected freelancer will participate in real-time meetings and provide daily updates. The preferred location is either Bangalore or Tirupati, and the person must be physically present in one of these locations. Key Responsibilities: - Assisting with data integration and migration tasks - Supporting ETL process development and execution - Contributing to data warehousing activities - Participating in real-time meetings - Providing daily updates Ideal Skills and Experience: - Proficient in Informatica and related tools - Strong background in data integration and ETL pro...
I'm seeking a skilled data engineer with expertise in performance tuning and optimization within ETL/Informatica development. The perfect candidate for this project will have a deep understanding of data processing, be able to identify slow processes, and implement solutions to enhance performance. Key Responsibilities: - Conduct performance tuning and optimization of ETL processes. - Identify and troubleshoot performance issues. Ideal Skills: - Proficiency in ETL/Informatica. - Experience with performance tuning and optimization. - Strong problem-solving skills. Please note, while the specific types of data sources and performance issues have not been disclosed, this role will primarily focus on enhancing overall ETL performance.
I'm looking for mid-level ETL/Informatica support for my finance project. Tasks Include (but are not limited to): - Data extraction and loading - Data transformation and cleansing - Error handling and debugging Ideal Skills and Experience: - Proficiency in ETL/Informatica - Experience in the finance industry - Strong problem-solving skills for debugging - Mid-level expertise in data handling and transformation
I'm looking for an experienced Informatica PowerCenter professional who can assist with data transformation, specifically data enrichment. Key Responsibilities: - Utilize Informatica PowerCenter for data transformation tasks. - Focus primarily on data enrichment processes. - Work primarily with databases as the data sources. Ideal Skills: - Proficient in Informatica PowerCenter. - Extensive experience in data transformation and enrichment. - Strong understanding of working with databases.
I'm seeking a professional with deep knowledge in DBT, Informatica, and SQL for an immediate migration project from Informatica to DBT. Key Requirements: - Proficient in working with Oracle, Informatica, DBT, and Redshift. - Strong SQL skills. - Experience with DBT and Informatica. Primary Goal: - The migration aims to achieve improved data integration. Additional Task: - Post-migration, I need cloud storage to be integrated with DBT. Ideal Skills: - Expertise in DBT and Informatica. - Strong understanding of SQL. - Experience with cloud storage integration. - Ability to work under tight deadlines. Immediate assistance is required. Please reach out if you can help.
You must have hands-on Kafka work to show. I'm seeking a Kafka instructor to conduct hands-on workshops focused on advanced Kafka concepts for a group of seasoned users. The training should cover: - Stream processing: delving into Kafka's capabilities for processing streams of data. - Advanced Kafka configurations: understanding the intricate configurations that Kafka offers for optimal performance and customization. - Integrating Kafka with other systems: exploring the interoperability of Kafka with various systems and platforms. Ideal candidates should have extensive experience with Kafka, particularly its advanced features, and a proven track record of conducting interactive, practical training sessions. Skills in stream processing, advanced configurations, and system integration are cruc...
I'm seeking an experienced Informatica MDM Developer to manage and oversee application integrations primarily with cloud-based platforms like AWS, Azure, and Google Cloud. Key Responsibilities: - Develop critical business Informatica entities using Informatica Intelligent Cloud Services (IICS), with a focus on Cloud Application Integration (CAI) and Cloud Data Integration (CDI). - Oversee the administration and architecture of the Informatica Data Management Cloud (IDMC). - Ensure seamless integration processes through effective management of IDMC application integration components. Requirements: - 7-8 years of hands-on experience in Informatica MDM. - Strong expertise in IDMC, CAI, and CDI. - Proven track record of managing cloud-based application...
I need a skilled professional to convert my Informatica Mappings to SQL queries. The source of my data is a Relational Database, and the queries should be compatible with Redshift. Ideal skills for this job: - Proficiency in SQL, especially with Redshift - Strong understanding of Informatica - Experience with Relational Databases Please note, the typical volume of data that the SQL queries will handle is uncertain at this moment, so the queries should be able to handle flexible data volumes.
I am looking for an Informatica IDQ developer with 3-5 years of experience. The developer will be expected to perform data profiling tasks using the Informatica technology. They should be an expert in this field and be available to fulfill project requirements. The developer should have a proven track record in successfully leading and executing such projects. In addition to the technical abilities required for this job, the developer should have excellent communication skills, analytical aptitude, and an eye for detail. If you think you possess these qualities and have the desired experience then this would be the perfect job for you!
I am looking for an experienced Data Integration Specialist for a 3-month contract, based in Pune or Bangalore. The ideal candidate will have over 6 years of experience, with a strong focus on Informatica PowerCenter, SQL, and Microsoft SQL Server. Key Responsibilities: - Creating and maintaining end-to-end source to target flows using Informatica PowerCenter. - Conducting data analysis and reporting, utilizing SQL expertise. - Basic Unix commands and shell scripting to support tasks. - Direct communication with users to understand and fulfill requirements. Compensation will be based on experience. Strong communication skills are essential, as the role involves independent task completion based on user requirements. Production monitoring and issue handling experience will ...
I'm looking for an expert who can assist with the installation of Informatica PowerCenter. Please note that I am not sure about the specific version of Informatica PowerCenter or the operating system it needs to be installed on. Ideal Skills and Experience: - Strong knowledge of Informatica PowerCenter - Experience in installing Informatica software on various operating systems - Ability to configure Informatica with existing databases (if needed)
I'm seeking an expert in data engin...extracting data from them is important. Ideal Skills and Experience: - Strong background in data engineering - Proven experience in building end-to-end data pipelines - Proficient in SQL and managing SQL databases - Experienced in processing structured data - Skilled in working with APIs - Knowledge of data warehousing concepts would be a plus - Experience with ETL tools like Apache NiFi, Talend, or Informatica. - Practiced in implementing data quality checks and validation techniques. - Familiarity with major cloud services such as AWS, Azure, or Google Cloud for data storage and processing. Please provide examples of similar projects you've worked on in your proposal. Preference for using AWS Glue for the ETL processes and data pi...
...with Web services including SOAP, WSDL, REST, SSL standards, security models and typical API client architecture, plus experience working with Platform Events, Bulk API and the Metadata API • Implementation knowledge of Flow, App Builder, Process Builder, validation rules, approval processes, reports and dashboards • Experience using Apex Data Loader (or other ETL tools like Informatica, Boomi, Cast Iron or Mulesoft) • Experience with database development, SQL or PL/SQL, database schemas, and stored procedures is a plus. • Follow unit testing and test class best practices, and be capable of coding for positive/negative scenarios • Must have experience with production deployment using CI/CD tools like GitHub, Gitlab, Bamboo, Copado etc, change-set/ecl...
...team members. 4. Monitoring and Control: Continuously monitor data quality and compliance, adjusting strategies as necessary. 5. Closure and Evaluation: Review the project's success and document lessons learned, making adjustments for future projects. To ensure the project's success, I recommend utilizing the following tools and technologies: 1. Data Governance Platforms: Tools like Collibra, Informatica, or IBM Data Governance for centralized data governance. 2. Data Quality Tools: Software such as Talend, SAS Data Management, or DataRobot. 3. Access Control and Security Solutions Tools like Okta, AWS Identity and Access Management (IAM), and Microsoft Azure AD for secure access control. With my expertise in data security and governance, I am confident in my ability ...
I'm seeking a professional to install Informatica PowerCenter 10.1 on a Windows server, using Oracle databases as a data source, primarily for data warehousing. Key Requirements: - Install Informatica PowerCenter 10.1 on a Windows server - Configure Oracle databases as data sources - Set up PowerCenter for data warehousing purposes Ideal candidates for this project should have: - Extensive experience with Informatica PowerCenter installations - Familiarity with Windows server environments - Understanding of data warehousing concepts and requirements - Expertise in Oracle database configuration and integration with Informatica.
I'm seeking an experienced professional in Informatica PowerCenter for data integration tasks. The key focus of the project is to seamlessly integrate various data sources into a singular platform. Your role would involve understanding the intricacies of data flows, ensuring accurate and efficient integration processes. Ideal skills for the job include: - Proficiency in Informatica PowerCenter - Strong understanding of data integration principles - Excellent problem-solving skills - Ability to work with complex data sets Experience with data migration and data warehousing projects would be a plus. Please note that the specific data sources for this project have not yet been determined, so flexibility and adaptability will be crucial. While this project does not current...
...others grow in this rapidly evolving field. Key Responsibilities: Develop and deliver training sessions on the following topics: Apache Spark PySpark FastAPI Spark NLP Databricks or Snowflake Integrations with cloud platforms (AWS, GCP) Data virtualization (Starburst) Data modeling (Apache Iceberg, Parquet, JSON) Data Lakehouse architecture (Spark and Flink) Apache Airflow Oracle GoldenGate, Informatica Flask Framework, Docker, Kubernetes Pandas Control-M for scheduling MLOps in Data Engineering and Machine Learning Models DataOps, Data Observability/Quality/Monitoring (Monte Carlo) Create comprehensive training materials, including presentations, documentation, and hands-on exercises. Engage with trainees through interactive discussions and Q&A sessions. Assess trainee pr...
I'm seeking an experienced ETL tester with a strong background in SQL and Python. The primary focus of this role will be ensuring data accuracy and consistency throughout the ETL process. The ETL tool in use for this project is Informatica, and the ETL process will primarily interact with SQL databases. Key Responsibilities: - Conduct thorough testing to ensure data accuracy and consistency. - Utilize Informatica for ETL processes. - Interact and perform testing on SQL databases. Ideal Skills and Experience: - Proven experience in ETL testing. - Strong SQL and Python skills. - Familiarity with Informatica. - Experienc...
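Accuracy-and-consistency checks of the kind described above usually reduce to source-to-target reconciliation queries. A minimal sketch in Python with sqlite3 (table and column names are hypothetical; against Informatica targets the same SQL would run over the actual source and target databases):

```python
import sqlite3

# Stand-in source and target tables loaded with matching data.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount REAL);
INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def reconcile(con):
    checks = {}
    # 1) Row-count parity between source and target.
    checks["counts_match"] = (
        con.execute("SELECT COUNT(*) FROM src").fetchone()
        == con.execute("SELECT COUNT(*) FROM tgt").fetchone()
    )
    # 2) No rows dropped or altered: the anti-join should return nothing.
    missing = con.execute("""
        SELECT s.id FROM src s
        LEFT JOIN tgt t ON t.id = s.id AND t.amount = s.amount
        WHERE t.id IS NULL
    """).fetchall()
    checks["no_missing_rows"] = (missing == [])
    return checks

print(reconcile(con))  # {'counts_match': True, 'no_missing_rows': True}
```

Wrapping checks like these in pytest gives a regression suite that reruns after every mapping change.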
The ideal candidate for our project should have: - Proven expertise in Dell Boomi development, primarily in integration development. - Strong skills and experience in data mapping and error handling. - An ability to streamline business processes, improve data accuracy, and enhance system efficiency. Even though you'll come across some specifics as ...we'll largely require your skills to address those above-mentioned areas. Therefore, you must be able to demonstrate enough proficiency in them. In order to make an informed bid, kindly note we don't have any designated systems or applications that would be integrated. The specific systems to be worked on would be determined as we progress. REQUIREMENT IS ONLY FOR DELL BOOMI DEVELOPER. With informatica experience NEE...
Top Informatica Community Articles
How to inspire confidence in my potential clients
Which web programming language should I learn?