[destination-mssql] no exit code issue #76113

@plovegro

Description

Connector Name

destination-mssql

Connector Version

2.2.15

What step the error happened?

During the sync

Relevant information

While running a large sync (source = HubSpot, though it happens with many other sources too) I am repeatedly getting this error. The sync then starts another attempt and goes around in circles. Some data is being written to the database.
The new attempt then starts with this:

2026-04-07 10:51:22 platform INFO Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, successivePartialFailureLimit=1000, totalCompleteFailureLimit=10, totalPartialFailureLimit=20, successiveCompleteFailures=0, successivePartialFailures=2, totalCompleteFailures=0, totalPartialFailures=2)
2026-04-07 10:51:22 platform WARN Backing off for: 0 seconds.
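For context on why the backoff is 0 seconds here: the retry state reports only *partial* failures (successivePartialFailures=2, successiveCompleteFailures=0), and partialFailureBackoffPolicy is null, so no wait applies. The complete-failure policy (minInterval=PT10S, maxInterval=PT30M, base=3) looks like capped exponential backoff. A minimal sketch of that policy as I read it from the log (my own reconstruction, not Airbyte's actual code):

```python
from datetime import timedelta

def complete_failure_backoff(successive_failures: int,
                             min_interval: timedelta = timedelta(seconds=10),
                             max_interval: timedelta = timedelta(minutes=30),
                             base: int = 3) -> timedelta:
    """Hypothetical reading of BackoffPolicy(minInterval=PT10S,
    maxInterval=PT30M, base=3): exponential growth from min_interval,
    capped at max_interval. Partial failures (null policy) wait 0s."""
    if successive_failures <= 0:
        # Matches the log: zero successive complete failures -> "Backing off for: 0 seconds."
        return timedelta(0)
    return min(max_interval, min_interval * base ** (successive_failures - 1))
```

Under this reading the first complete failure would wait 10s, the second 30s, the third 90s, and so on up to the 30-minute cap.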

The Active streams section also shows some strange values:

[Screenshot: Active streams section]

Database at time of screenshot had 400k rows.

Relevant log output

2026-04-07 10:28:44 replication-orchestrator WARN Failed to wait for exit value file /dest/exitCode.txt to be found.
2026-04-07 10:29:44 replication-orchestrator WARN Failed to wait for exit value file /source/exitCode.txt to be found.
2026-04-07 10:29:44 replication-orchestrator INFO Closing StateCheckSumCountEventHandler
2026-04-07 10:29:44 replication-orchestrator INFO sync summary: {"status" : "failed",
  "recordsSynced" : 30528,
  "bytesSynced" : 20969590,
  "startTime" : 1775520644057,
  "endTime" : 1775521784699,
  "totalStats" : {
    "bytesCommitted" : 20969590,
    "bytesEmitted" : 458247829,
    "destinationStateMessagesEmitted" : 1,
    "destinationWriteEndTime" : 0,
    "destinationWriteStartTime" : 1775520644154,
    "meanSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBeforeSourceStateMessageEmitted" : 1,
    "maxSecondsBetweenStateMessageEmittedandCommitted" : 152,
    "meanSecondsBetweenStateMessageEmittedandCommitted" : 152,
    "recordsEmitted" : 982068,
    "recordsCommitted" : 30528,
    "recordsFilteredOut" : 0,
    "bytesFilteredOut" : 0,
    "replicationEndTime" : 1775521784697,
    "replicationStartTime" : 1775520644057,
    "sourceReadEndTime" : 0,
    "sourceReadStartTime" : 1775520644155,
    "sourceStateMessagesEmitted" : 1
  },
  "streamStats" : [ {
    "streamName" : "email_events",
    "stats" : {
      "bytesCommitted" : 20969590,
      "bytesEmitted" : 458247829,
      "recordsEmitted" : 982068,
      "recordsCommitted" : 30528,
      "recordsFilteredOut" : 0,
      "bytesFilteredOut" : 0
    }
  } ],
  "performanceMetrics" : {
    "processFromSource" : {
      "elapsedTimeInNanos" : 5852570696,
      "executionCount" : 982071,
      "avgExecTimeInNanos" : 5959.417084915449
    },
    "readFromSource" : {
      "elapsedTimeInNanos" : 1018918770091,
      "executionCount" : 982122,
      "avgExecTimeInNanos" : 1037466.5979287706
    },
    "processFromDest" : {
      "elapsedTimeInNanos" : 1651634,
      "executionCount" : 1,
      "avgExecTimeInNanos" : 1651634.0
    },
    "writeToDest" : {
      "elapsedTimeInNanos" : 10222280723,
      "executionCount" : 981102,
      "avgExecTimeInNanos" : 10419.182432611491
    },
    "readFromDest" : {
      "elapsedTimeInNanos" : 1019731145703,
      "executionCount" : 122252,
      "avgExecTimeInNanos" : 8341222.603335733
    }
  }
}
2026-04-07 10:29:44 replication-orchestrator INFO failures: [ {
  "failureOrigin" : "destination",
  "internalMessage" : "Destination process message delivery failed",
  "externalMessage" : "Something went wrong within the destination connector",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 139039,
    "connector_command" : "write"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.DestinationException: Destination process message delivery failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:454)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:244)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.io.IOException: Broken pipe\n\tat java.base/sun.nio.ch.UnixFileDispatcherImpl.write0(Native Method)\n\tat java.base/sun.nio.ch.UnixFileDispatcherImpl.write(UnixFileDispatcherImpl.java:65)\n\tat java.base/sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:137)\n\tat java.base/sun.nio.ch.IOUtil.write(IOUtil.java:102)\n\tat java.base/sun.nio.ch.IOUtil.write(IOUtil.java:72)\n\tat java.base/sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:300)\n\tat java.base/sun.nio.ch.ChannelOutputStream.writeFully(ChannelOutputStream.java:68)\n\tat java.base/sun.nio.ch.ChannelOutputStream.write(ChannelOutputStream.java:105)\n\tat java.base/sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:309)\n\tat java.base/sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:381)\n\tat java.base/sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:357)\n\tat java.base/sun.nio.cs.StreamEncoder.lockedWrite(StreamEncoder.java:158)\n\tat java.base/sun.nio.cs.StreamEncoder.write(StreamEncoder.java:139)\n\tat java.base/java.io.OutputStreamWriter.write(OutputStreamWriter.java:219)\n\tat java.base/java.io.BufferedWriter.implFlushBuffer(BufferedWriter.java:178)\n\tat java.base/java.io.BufferedWriter.flushBuffer(BufferedWriter.java:163)\n\tat 
java.base/java.io.BufferedWriter.implWrite(BufferedWriter.java:334)\n\tat java.base/java.io.BufferedWriter.write(BufferedWriter.java:313)\n\tat java.base/java.io.Writer.write(Writer.java:278)\n\tat io.airbyte.workers.internal.VersionedAirbyteMessageBufferedWriter.write(VersionedAirbyteMessageBufferedWriter.java:39)\n\tat io.airbyte.workers.internal.LocalContainerAirbyteDestination.acceptWithNoTimeoutMonitor(LocalContainerAirbyteDestination.kt:137)\n\tat io.airbyte.workers.internal.LocalContainerAirbyteDestination.accept(LocalContainerAirbyteDestination.kt:96)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.writeToDestination(BufferedReplicationWorker.java:439)\n\t... 5 more\n",
  "timestamp" : 1775521664087
}, {
  "failureOrigin" : "replication",
  "internalMessage" : "No exit code found.",
  "externalMessage" : "Something went wrong during replication",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 139039
  },
  "stacktrace" : "java.lang.IllegalStateException: No exit code found.\n\tat io.airbyte.workers.internal.ContainerIOHandle.getExitCode(ContainerIOHandle.kt:101)\n\tat io.airbyte.workers.internal.LocalContainerAirbyteDestination.getExitValue(LocalContainerAirbyteDestination.kt:118)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromDestination(BufferedReplicationWorker.java:496)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsync$2(BufferedReplicationWorker.java:216)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n",
  "timestamp" : 1775521664086
}, {
  "failureOrigin" : "source",
  "internalMessage" : "Source process read attempt failed",
  "externalMessage" : "Something went wrong within the source connector",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 139039,
    "connector_command" : "read"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source process read attempt failed\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:376)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithHeartbeatCheck$3(BufferedReplicationWorker.java:223)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: java.lang.IllegalStateException: No exit code found.\n\tat io.airbyte.workers.internal.ContainerIOHandle.getExitCode(ContainerIOHandle.kt:101)\n\tat io.airbyte.workers.internal.LocalContainerAirbyteSource.getExitValue(LocalContainerAirbyteSource.kt:89)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.readFromSource(BufferedReplicationWorker.java:356)\n\t... 5 more\n",
  "timestamp" : 1775521664688
}, {
  "failureOrigin" : "replication",
  "internalMessage" : "io.airbyte.workers.exception.WorkerException: Destination has not terminated.  This warning is normal if the job was cancelled.",
  "externalMessage" : "Something went wrong during replication",
  "metadata" : {
    "attemptNumber" : 0,
    "jobId" : 139039
  },
  "stacktrace" : "java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Destination has not terminated.  This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.general.BufferedReplicationWorker$CloseableWithTimeout.lambda$close$0(BufferedReplicationWorker.java:548)\n\tat io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsyncWithTimeout$5(BufferedReplicationWorker.java:244)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: io.airbyte.workers.exception.WorkerException: Destination has not terminated.  This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.internal.LocalContainerAirbyteDestination.close(LocalContainerAirbyteDestination.kt:64)\n\tat io.airbyte.workers.general.BufferedReplicationWorker$CloseableWithTimeout.lambda$close$0(BufferedReplicationWorker.java:546)\n\t... 5 more\n",
  "timestamp" : 1775521724695
} ]
2026-04-07 10:29:44 replication-orchestrator INFO 
2026-04-07 10:29:44 replication-orchestrator INFO ----- END REPLICATION -----
2026-04-07 10:29:44 replication-orchestrator INFO 
2026-04-07 10:29:44 replication-orchestrator INFO Returning output...
2026-04-07 10:29:44 platform INFO Closing Segment analytics client...
2026-04-07 10:29:44 platform INFO Waiting for Segment analytic client to flush enqueued messages...
2026-04-07 10:29:44 platform INFO Segment analytic client flush complete.
2026-04-07 10:29:44 platform INFO Segment analytics client closed.  No new events will be accepted.
2026-04-07 10:29:45 platform INFO 
----- START POST REPLICATION OPERATIONS -----

2026-04-07 10:29:45 platform INFO No post-replication operation(s) to perform.
2026-04-07 10:29:45 platform INFO 
----- END POST REPLICATION OPERATIONS -----

2026-04-07 10:29:46 platform INFO Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, successivePartialFailureLimit=1000, totalCompleteFailureLimit=10, totalPartialFailureLimit=20, successiveCompleteFailures=0, successivePartialFailures=1, totalCompleteFailures=0, totalPartialFailures=1)
 Backoff before next attempt: 0 seconds
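For anyone triaging: the "No exit code found." failure comes from the orchestrator timing out while waiting for the connector container to write its exit status to a file (/dest/exitCode.txt and /source/exitCode.txt in the WARN lines above). A rough sketch of that wait-then-read pattern as I understand it from the stack trace (hypothetical helper, not Airbyte's actual ContainerIOHandle implementation):

```python
import time
from pathlib import Path

def read_exit_code(path: Path, timeout_s: float = 60.0, poll_s: float = 0.5) -> int:
    """Poll for a connector's exit-code file until a deadline; if the
    container dies without writing it (e.g. after the broken pipe above),
    the caller sees an error like the one in this log."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if path.exists():
            return int(path.read_text().strip())
        time.sleep(poll_s)
    # Mirrors the IllegalStateException("No exit code found.") in the stacktrace
    raise RuntimeError("No exit code found.")
```

That would explain the ordering in the log: the broken pipe kills the destination's stdin first, then both exit-code waits time out, and the attempt is marked failed even though some records were committed.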

Contribute

  • Yes, I want to contribute

Internal Tracking: https://github.com/airbytehq/oncall/issues/11899
