Channel: Attunity Integration Technology Forum

How to allow other users to access Replicate Express instead of the admin account.

Issue: How can users other than the admin account be allowed to access Replicate Express? Thanks.




From:
Hao Zhang <Hao.Zhang@EMCIns.com>
Sent: Friday, February 9, 2018 1:57 PM
To: Reza S. Khan; Replicate Express
Cc: Carl Soto
Subject: RE: Cannot login Attunity express replicate

I followed the documentation here, but it is not working.




Example:
"login_pam_libpam_full_path":"/lib64/libpam.so.0",
"login_pam_service_name": "system-auth"
}


Note: save the file in UTF-8 format.
Below I have the users bob, roger, and administrator; these are local users on the REPLICATE1 server.




<?xml version="1.0" encoding="utf-8"?>
<UserConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" revision="3">
<Role name="Admin" anonymous="false">
<UserRef name="REPLICATE1\bob" />
<UserRef name="REPLICATE1\Administrator" />
<UserRef name="REPLICATE1\roger" />
<GroupRef name="REPLICATE1\AttunityReplicateAdmins" useSamAccountName="false" />
</Role>
</UserConfiguration>

Hao Zhang
Operating Systems Analyst
Corporate Office | IT Technical Support
515-345-7623
Hao.Zhang@EMCIns.com

From: Hao Zhang
Sent: Friday, February 9, 2018 12:54 PM
To: 'Reza S. Khan' <Reza.Khan@attunity.com>; Replicate Express <Replicate.Express@attunity.com>
Cc: Carl Soto <Carl.F.Soto@EMCIns.com>
Subject: RE: Cannot login Attunity express replicate

Hi Reza,

admin works. How do I give my co-workers permission? I do not want to give them admin permission. Thanks.
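For reference, a hedged sketch of what a non-admin entry could look like in the same UserConfiguration file shown above; the role name "Viewer" and the user REPLICATE1\coworker are assumptions for illustration (Replicate documents roles besides Admin, such as Designer, Operator and Viewer), so the exact names should be checked against the documentation for your version:

<Role name="Viewer" anonymous="false">
<UserRef name="REPLICATE1\coworker" />
</Role>

Each <Role> element carries its own <UserRef>/<GroupRef> list, so a co-worker can be referenced under a less privileged role instead of under Admin.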

Meteor app on AWS Lambda

Hi, can anyone tell me an easy process for deploying a Meteor app from the local development machine to AWS Lambda? And would it work fine?

Attunity Microsoft Connector for Oracle in combination with Windows Server 2016?

Hi guys,

I have a few questions.

My colleague and I have received our new development server, which has the following specs:
Windows Server 2016, SQL Server 2016 Enterprise Edition, Visual Studio 2017, Oracle Client 32 and 64 bit installed.

We are trying to connect from SSIS to an Oracle DB with the Attunity 4.0 and 5.0 drivers (I read that both versions need to be installed to be visible in Visual Studio).
In Visual Studio our package runs fine, but when we run it via the SQL Server Agent, we get the following error:
The version of Oracle Source is not compatible with this version of the DataFlow.

I noticed in the specs of the Attunity driver that it supports up to Windows Server 2012 R2.
Could this be the reason that the SQL Server Agent doesn't run our package?
If so, when will a driver be released that also supports Windows Server 2016?
Or is there something else that we are doing wrong on our server or in the SQL Server Agent?
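One way to narrow this down (a suggestion, not something from this thread) is to run the package on the server itself with both the 64-bit and the 32-bit dtexec hosts. Visual Studio may execute the package with the connector assemblies that ship with SSDT and under a different bitness than the SQL Server Agent, which uses the server's installed SSIS runtime, so a connector/runtime mismatch often only shows up in the Agent. Paths and the package location below are illustrative for a SQL Server 2016 installation:

"C:\Program Files\Microsoft SQL Server\130\DTS\Binn\DTExec.exe" /File "D:\SSIS\OurPackage.dtsx"
"C:\Program Files (x86)\Microsoft SQL Server\130\DTS\Binn\DTExec.exe" /File "D:\SSIS\OurPackage.dtsx"

If only one bitness fails, the Agent job step's "Use 32 bit runtime" option is a possible workaround while confirming that the connector version installed on the server matches the one the package was built against.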


Thanks in advance,
Sven

Failed to establish an ODBC connection with the database server after restaged PC

My PC was restaged and I've upgraded to Visual Studio 2017, SSDT 15.1.xx, SSIS designer 14.0.xx, and the following TTU components:
Version Display Name
--------------------------------------------------------------------------------------------------------------------
14.10.00.03 Teradata Visual Explain 14.10.0.3
15.10.01.00 Teradata GSS Client nt-x8664 15.10.1
16.10.0.0 .NET Data Provider for Teradata 16.10
16.10.00.00 Shared ICU Libraries for Teradata 16.10
16.10.00.00 Shared ICU Libraries for Teradata nt-x8664 16.10
16.10.00.00 Teradata Named Pipes Access Module 16.10
16.10.00.00 Teradata Named Pipes Access Module nt-x8664 16.10
16.10.00.00 Teradata OLE DB Access Module 16.10
16.10.00.00 Teradata OLE DB Access Module nt-x8664 16.10
16.10.00.02 Teradata BTEQ 16.10.0.2
16.10.00.02 Teradata FastLoad 16.10.0.2
16.10.00.02 Teradata SQL Assistant 16.10.0.2
16.10.00.03 ODBC Driver for Teradata 16.10.0.3
16.10.00.03 ODBC Driver for Teradata nt-x8664 16.10.0.3
16.10.00.03 Teradata Data Connector 16.10.0.3
16.10.00.03 Teradata Data Connector nt-x8664 16.10.0.3
16.10.00.03 Teradata FastExport 16.10.0.3
16.10.00.03 Teradata MultiLoad 16.10.0.3
16.10.00.03 Teradata Parallel Transporter Base 16.10.0.3
16.10.00.03 Teradata Parallel Transporter Base nt-x8664 16.10.0.3
16.10.00.03 Teradata Parallel Transporter Stream 16.10.0.3
16.10.00.03 Teradata Parallel Transporter Stream nt-x8664 16.10.0.3
16.10.00.03 Teradata TPump 16.10.0.3
16.10.00.05 Teradata CLIv2 16.10.0.5
16.10.00.05 Teradata CLIv2 nt-x8664 16.10.0.5
16.10.00.05 Teradata GSS Administration Package nt-i386 16.10.0.5
16.10.00.05 Teradata GSS Administration Package nt-x8664 16.10.0.5
16.10.05.00 Teradata Tools and Utilities - Base 16.10.05

and Attunity connectors 5.0. My OS is Microsoft Windows 7 Enterprise SP1, 64 bit.
We're running Teradata Database 15.10.

In my existing SSIS package, the connections that use the .NET Data Provider for Teradata work fine.

My MSTera connection manager returns an ODBC error:
Failed to establish an ODBC connection with the database server. Verify that the Teradata ODBC Driver for Windows x86 is installed properly. SqlState = IM002 Message = [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
at Microsoft.SqlServer.Dts.Runtime.ConnectionManager.AcquireConnection(Object txn)
at Attunity.IntegrationServices.DataFlowUI.TeraDataUI.TeraConnectionDialog.testConBtn_Click(Object sender, EventArgs e)


I can connect using SQLA via ODBC and using linked tables in MSAccess via ODBC.

What do I need to change to get my SSIS package to work with the Attunity connector so I can use a Teradata destination data flow object?

Thanks for your help.

fyi, I've tried setting the project Run64BitRuntime property to both True and False, and changed the project TargetServerVersion property to SQL Server 2017.
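A quick check that often explains SqlState IM002 here (a diagnostic suggestion, not a confirmed cause): the SSDT designer and its connection-test dialog run as 32-bit processes, and most of the TTU packages listed above are nt-x8664 (64-bit) builds, so it is worth comparing the 64-bit and 32-bit ODBC driver registrations:

reg query "HKLM\SOFTWARE\ODBC\ODBCINST.INI\ODBC Drivers"
reg query "HKLM\SOFTWARE\WOW6432Node\ODBC\ODBCINST.INI\ODBC Drivers"

The 32-bit ODBC Administrator (%windir%\SysWOW64\odbcad32.exe) should list the Teradata driver on its Drivers tab; if it appears only in the 64-bit Administrator, installing the 32-bit ODBC Driver for Teradata would be the next thing to try.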

Kinesis target endpoint

Hi,
Will the Attunity AWS AMI include Kinesis support as a target endpoint at some point? I was skimming through the website and it seems to be supported.

Thanks,

Andres

Redshift target had a transient disk full

Hi,
During the weekend we had a transient spike in disk usage (100%) on the Redshift target, and some of the tables/S3 files could not be loaded for about an hour:

[Screenshot attached: Screen Shot 2018-02-27 at 10.03.41 am.png]

How would Attunity handle this? Will it retry until it is able to load again and then continue, or will it eventually skip those rows?

Thanks,

Andres

Attunity Replicate Hourly Console

I have built two Attunity Replicate (6.0.0.238) Hourly AWS EC2 instances. On one of them I can get into the Attunity Console without issue; on the other I cannot. I have tried the local admin and my domain account, and both state:
You are not authorized to use the Attunity Replicate console.
Can you please assist?

Order of rows returned from RMS when primary index allows duplicate keys

Hi (Hein? U there?)

We're using Attunity AIS with an RMS datasource reading an indexed file that allows duplicate keys in the primary index. When we access the table via ODBC, if two records have the same index value, they don't necessarily come back in the same order they appear in the file. Is there a way to force the datasource to return the records in their natural order? Maybe a checkbox click in Attunity studio? Would describing the index in the metadata config help?
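For what it's worth, ODBC and SQL make no ordering guarantee unless the query asks for one, so regardless of any driver option, the portable approach is an explicit ORDER BY; a sketch with hypothetical column names, assuming the metadata exposes some secondary field that correlates with insertion order:

SELECT * FROM mytable ORDER BY dup_key_col, entry_seq_col

Whether such a tie-breaking field can be exposed for an RMS file, or whether Studio has a setting that preserves file order, is the part that would need confirmation from Attunity.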

ACF2 commands to complete Attunity Replicate installation

Our customer has a DB2 mainframe using ACF2 as Security software Package rather than RACF. Does anyone know the equivalent ACF2 commands to complete the installation of Attunity Software on their mainframe environment?

Netezza to Azure SQL DW Load

What would be considered a 'large' vs a 'normal' sized table when loading from Netezza to SQL DW, in terms of load performance?

Is there is a tipping point for very large tables?

Filtering of Oracle CDC Source

Currently I have a set of Oracle source tables that I am attempting to filter using the "Record Selection Condition" option found under Table Settings in the Task Designer view. In the Filter tab I have placed an expression in the "Record Selection Condition" field that reads:

$INBOUND_OUTBOUND_INDICATOR IS 'O' AND $LAST_UPDATED_DTTM >= datetime('now','-30 days')

This indicates that I wish to track changes for, and replicate, only records whose INBOUND_OUTBOUND_INDICATOR column is 'O' and whose timestamp is from 30 days ago or less.

My question is: what happens when a record that did not originally meet this "Record Selection Condition" on the initial load is later updated so that it now meets the criteria? Is it then inserted into the target, with updates applied until it no longer meets the criteria?

Also, what happens to a record that originally met the criteria and is later updated so that it no longer falls under the "Record Selection Condition"? Will I have to clean these up manually outside Replicate?

I already have a support ticket open for this (status "Researching"), but thought I would turn to the community to see if I could get answers.



  • Attunity Replicate 5.0.2
  • Source: Oracle 11.x
  • Target: SQL Server 2012



CDC - Oracle Logical Standby Support?

I am trying to capture change data from an Oracle Logical Standby database, but after starting the service it goes into LOGGER status and has the following messages in the log file. I've seen a few documents stating that CDC is only supported using an Oracle physical standby, but those documents are for the Replicate product, so my question is whether it is possible to configure Change Data Capture for Oracle by Attunity to use an Oracle logical standby.

"3/13/2018 2:10:54 PM","ERROR","ServerXX","SUSPENDED","LOGGER","ORACDC514E:Failed to add redo log with sequence 843.","source","",""
"3/13/2018 2:10:54 PM","TRACE","ServerXX","SUSPENDED","LOGGER","ORACDC000T:Error encountered at set position - EOF simulated","source","",""
"3/13/2018 2:10:54 PM","ERROR","ServerXX","SUSPENDED","LOGGER","ORACDC511E:The Oracle CDC failed to position.","source","",""

Thanks,
Stuart
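As a quick sanity check (a suggestion, not taken from the thread), the role of the database the CDC service connects to can be confirmed with a standard Oracle dictionary query, which at least makes it unambiguous that the errors are coming from a logical standby:

SELECT database_role, open_mode, log_mode FROM v$database;

DATABASE_ROLE reports PHYSICAL STANDBY or LOGICAL STANDBY accordingly. If the documentation for the CDC components, like the Replicate documentation, only lists physical standby, the same restriction may apply here, but that would need confirmation from Attunity.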

SSIS doesn't transfer any data while using Attunity Connectors V5, V4, V3

Env :
VS : 2015 Community
Data Tools build : 14 (Latest for VS 2015)
SSIS Target Server : 2012
Sql Server : 2016
Attunity Connectors installed : V5.0 (32 bit & 64 bit) & V4.0 (32 bit & 64 bit) & V3.0 (32 bit & 64 bit)

When I run my SSIS package through VS 2015, locally:
Issues:
1. No data transfer happens (none of the tasks inside the SSIS package get executed).
2. SSIS execution completes with the output "<Package path name>Starting and <Package path name>finished: Canceled." (no progress log).
3. Not sure where/what exactly I need to look at to work this out.

I feel stuck :(

Getting a rejecting new connections error

I'm getting an error and I don't know where to go to resolve it:

Server has 573 running processes. RepliWeb server configuration is set to reject new connections at 500.

I can't find anything in the user manual either.

A quick way to add/remove/display a job's scheduled run times

I tried with PowerShell, but for some reason it looks like the only way to interact with RepliWeb using PowerShell is to read the entire list of jobs into a variable every time, and with 1800+ jobs it takes almost 2 full minutes to read in every job. I also looked at the CLI that comes with RepliWeb, but that was even more confusing to try to script with.

Has anyone figured out a quick way to do what I'm trying to do? I just want something I can run to add a window to a job's schedule. We have developers that push out code at times they request; it's always the same jobs, but sometimes at different times, and I'd like to be able to run a script and have it update the job. I'd also like to be able to remove the scheduled time with another script after the developers have run their jobs.

Any help would be greatly appreciated, thanks!

-Erik

Source Endpoint: Dynamic Unstructured Embedded XML and Parquet files

Hi

I need to understand whether Attunity Replicate can support "Dynamic Unstructured Embedded XML and Parquet files" as a source endpoint.
Please let me know if I need to share more specific details.

Thanks

automating Attunity driver install

This post is going to be a shot in the dark, but hopefully someone out there can help.

I'm attempting to automate the installation of the Attunity (v3.0) Oracle drivers on Windows 2012R2 (SQL2014) and Windows 2016 (SQL2016) platforms and running into odd behavior. The installer MSI doesn't behave like normal MSIs in that the installer presents a popup about restarting the SSIS service after installation. The problem is that the usual '/q' or '/qn' arguments to perform a silent install don't suppress this popup and without satisfying this popup (clicking either yes/no), the install is put in a suspended state.

I started digging into the MSI to see if there may be a property that suppresses this popup, but so far I haven't found anything useful.

In this scenario, I'm using puppet as the automation tool and a normal execution line like "msiexec /i 'path/to/msi/' /qn" results in an infinite puppet run since that popup doesn't get an answer (and isn't suppressed).
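For reference, these are the standard Windows Installer switches usually combined for a fully silent, no-reboot install; the MSI filename and log path are illustrative, and if the restart prompt comes from a custom-action dialog rather than a standard MSI reboot request it may simply ignore them, which would be consistent with the behavior described above:

msiexec /i "C:\installers\AttunitySSISOraAdaptersSetup.msi" /qn /norestart REBOOT=ReallySuppress /l*v C:\temp\attunity_install.log

The verbose log at least shows which action the installation is suspended on.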

I have tried a few messy hacks like spinning off a new msiexec process and letting it run without confirming success. While this seems to get the driver onto the server, it still leaves the installation suspended and prevents the use of msiexec for future installations.

Anyone out there run into anything similar and found a way around this?

Long running queries - excessive size on overflow hash.

Can anyone explain how Attunity Connect returns rows from a query against RMS and RDB datasources? Does it first collect all rows in memory, spill over to disk, and then return the data? Can it stream for simple queries such as a basic "select * from table"? We're trying to ETL (via Informatica PE) all rows from an RDB datasource; it hangs, and we are observing a large hash overflow file getting created. Should we paginate the query, maybe every 1000 rows, and walk down the table that way? It's also possible the hash file is getting created by a different query against RMS datasources with a join. I can't really tell why/where this file is growing so large. What's the story behind the hash/overflow file?
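On the pagination idea: one common way to walk a large table in slices without relying on driver-side paging is to bound each query by a key range and advance the bound in the client. A sketch with hypothetical column names, assuming a monotonically increasing key column and a dialect that accepts plain WHERE/ORDER BY:

SELECT * FROM mytable WHERE key_col > :low_key AND key_col <= :high_key ORDER BY key_col

Whether this actually avoids the hash overflow file depends on how Attunity Connect executes the ORDER BY, so it is a direction to test rather than a confirmed fix.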

Status goes from "Idle" to "LOGGER" with errors: ORACDC514E, ORACDC511E

Source system:
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options


Destination System:
Windows Server 2012 R2
SQL Server 2017 x64 Developer Edition (cumulative update 5)
Oracle Instant Client 12.2.0.1 x64 (ODAC used for installation)
Oracle Instant Client 12.1.0.2 x32 (ODAC xcopy used for installation)
Microsoft Connector for Oracle by Attunity 5.0
Attunity Oracle CDC Designer (from SQL Server 2017 feature pack)
Attunity Oracle CDC Service (from SQL Server 2017 feature pack)
Hi,

I have set up a CDC Service and configured a CDC Instance against the Oracle database; all steps passed / were green (no indication of errors).


I can access the Oracle database using SQL*Plus and query the tables.


The user running the CDC service is part of sysadm (during the testing period) and is also granted full permissions on the Oracle client directory.


The problem appears as soon as I start the Oracle CDC Instance; it then goes from "Idle" to LOGGER.
(I have read all the forum threads similar to my problem, but without finding a solution.)


I have asked the DBA to clear the redo log, but that didn't do anything.


This is really a showstopper for us at the moment, so all help will be highly appreciated.


The log file is attached.
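Two generic prerequisites for log-based Oracle CDC worth ruling out (an assumption about possible causes, not something taken from this thread) are that the source database is in ARCHIVELOG mode and that minimal supplemental logging is enabled; both can be checked with a standard dictionary query:

SELECT log_mode, supplemental_log_data_min FROM v$database;

LOG_MODE should report ARCHIVELOG and SUPPLEMENTAL_LOG_DATA_MIN should report YES (or IMPLICIT); if either is off, a log-reading component typically cannot position in the redo stream.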

Apache Kudu as a target endpoint (via Attunity ODBC)

Although Attunity supports replication to HDFS and Hive, both of these targets fail to work with DDL changes and updates/deletes in the data because of their nature. Hive is getting there, but we are not using Hortonworks, so we cannot use that functionality. What we can use is Kudu, and although there is no Kudu endpoint, it would be an option to leverage the Impala ODBC driver to speak SQL to Kudu. This would require setting up an ODBC endpoint in Attunity and probably some advanced settings to bridge possible gaps in SQL statements.

Did anyone try this option? Or did anyone try to connect via ODBC to some other non-supported database and therefore has some insights to share?