
    Channel in a faulted State when performing a network discovery

    bwicks

      I got a channel in a faulted state while performing a network discovery. I am creating a Diag now and will upload it.

      2011-08-24 12:46:30,734 [WorkerProcessMonitor] WARN  SolarWinds.JobEngine.Engine.WorkerProcess - CommunicationException detected
      System.ServiceModel.CommunicationException: There was an error reading from the pipe: The pipe has been ended. (109, 0x6d). ---> System.IO.PipeException: There was an error reading from the pipe: The pipe has been ended. (109, 0x6d).
         at System.ServiceModel.Channels.PipeConnection.FinishSyncRead(Boolean traceExceptionsAsErrors)
         at System.ServiceModel.Channels.PipeConnection.Read(Byte[] buffer, Int32 offset, Int32 size, TimeSpan timeout)
         --- End of inner exception stack trace ---

      Server stack trace:
         at System.ServiceModel.Channels.PipeConnection.Read(Byte[] buffer, Int32 offset, Int32 size, TimeSpan timeout)
         at System.ServiceModel.Channels.DelegatingConnection.Read(Byte[] buffer, Int32 offset, Int32 size, TimeSpan timeout)
         at System.ServiceModel.Channels.StreamedFramingRequestChannel.SendPreamble(IConnection connection, TimeoutHelper& timeoutHelper, ClientFramingDecoder decoder, SecurityMessageProperty& remoteSecurity)
         at System.ServiceModel.Channels.StreamedFramingRequestChannel.StreamedConnectionPoolHelper.AcceptPooledConnection(IConnection connection, TimeoutHelper& timeoutHelper)
         at System.ServiceModel.Channels.ConnectionPoolHelper.EstablishConnection(TimeSpan timeout)
         at System.ServiceModel.Channels.StreamedFramingRequestChannel.StreamedFramingRequest.SendRequest(Message message, TimeSpan timeout)
         at System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
         at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)
         at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
         at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs)
         at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
         at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)

      Exception rethrown at [0]:
         at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
         at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
         at SolarWinds.JobEngine.IJobExecutionEngine.GetJobResult(Guid[] jobId)
         at SolarWinds.JobEngine.Engine.WorkerProcessWCFProxy.GetJobResult(Guid[] jobId)
         at SolarWinds.JobEngine.Engine.WorkerProcess.GetJobResult(Guid[] jobIds)

        • Re: Channel in a faulted State when performing a network discovery
          Karlo.Zatylny

          Thanks for finding this. I've created item 79562 in our tracking system to analyze it.

          Was there anything special about this discovery?  Lots of nodes, interfaces?  Were you able to run the discovery after waiting a little while, or did restarting the services help?

            • Re: Channel in a faulted State when performing a network discovery
              bwicks

              Uploaded - 8_24_SolarwindsDiagnostics.zip

              Trying the same discovery now after a service restart.

              383 nodes

              ~3700 interfaces

              • Re: Channel in a faulted State when performing a network discovery
                dhenggeler

                I see the same problem in a different context. I am using the Orion SDK's PowerShell snapin.

                 

                When I do this, it works:

                PS C:\Users\. . . > Get-SwisData $swis 'SELECT TOP 10 MACAddress,IPAddress.IPAddress,DNSName FROM Orion.UDT.Endpoint LEFT JOIN Orion.UDT.IPAddress ON Endpoint.EndpointID=IPAddress.EndpointID LEFT JOIN Orion.UDT.DNSName ON IPAddress.IPAddress=DNSName.IPAddress'

                 

                When I do this, it works (increased TOP to 100):

                PS C:\Users\. . . > Get-SwisData $swis 'SELECT TOP 100 MACAddress,IPAddress.IPAddress,DNSName FROM Orion.UDT.Endpoint LEFT JOIN Orion.UDT.IPAddress ON Endpoint.EndpointID=IPAddress.EndpointID LEFT JOIN Orion.UDT.DNSName ON IPAddress.IPAddress=DNSName.IPAddress'

                 

                When I do this, it fails with a .NET exception similar to what bwicks described (TOP increased to 1000):

                PS C:\Users\. . . > Get-SwisData $swis 'SELECT TOP 1000 MACAddress,IPAddress.IPAddress,DNSName FROM Orion.UDT.Endpoint LEFT JOIN Orion.UDT.IPAddress ON Endpoint.EndpointID=IPAddress.EndpointID LEFT JOIN Orion.UDT.DNSName ON IPAddress.IPAddress=DNSName.IPAddress'
                Get-SwisData : The communication object, System.ServiceModel.Security.SecuritySessionClientSettings`1+ClientSecurityDuplexSessionChannel[System.ServiceModel.Channels.IDuplexSessionChannel], cannot be used for communication because it is in the Faulted state.
                At line:1 char:13
                + Get-SwisData <<<<  $swis 'SELECT TOP 1000 MACAddress,IPAddress.IPAddress,DNSName FROM Orion.UDT.Endpoint LEFT JOIN Orion.UDT.IPAddress ON Endpoint.EndpointID=IPAddress.EndpointID LEFT JOIN Orion.UDT.DNSName ON IPAddress.IPAddress=DNSName.IPAddress'
                    + CategoryInfo          : NotSpecified: (:) [Get-SwisData], CommunicationObjectFaultedException
                    + FullyQualifiedErrorId : System.ServiceModel.CommunicationObjectFaultedException,SwisPowerShell.GetSwisData
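
                In the meantime, since TOP 100 succeeds and TOP 1000 faults the channel, one workaround that seems plausible is paging the query in small windows so each response stays around the size that works. The sketch below is only a guess at a workaround, not a confirmed fix: it assumes the SwisSnapin is loaded, that this SWIS build accepts the SWQL WITH ROWS paging clause, and the hostname and credentials are placeholders. A faulted channel cannot be reused, so $swis has to come from a fresh Connect-Swis first.

                # Paging workaround (sketch; assumptions noted above).
                # Reconnect first; a channel in the Faulted state cannot be reused.
                Add-PSSnapin SwisSnapin -ErrorAction SilentlyContinue
                $swis = Connect-Swis -Hostname 'localhost' -UserName 'admin' -Password 'pass'   # placeholder host/credentials

                # Same UDT query as above, with an ORDER BY so the paging windows are stable.
                $query = "SELECT MACAddress, IPAddress.IPAddress, DNSName " +
                         "FROM Orion.UDT.Endpoint " +
                         "LEFT JOIN Orion.UDT.IPAddress ON Endpoint.EndpointID = IPAddress.EndpointID " +
                         "LEFT JOIN Orion.UDT.DNSName ON IPAddress.IPAddress = DNSName.IPAddress " +
                         "ORDER BY MACAddress"

                $batch   = 100      # 100 rows worked above; 1000 faulted the channel
                $start   = 1
                $results = @()
                do {
                    # WITH ROWS x TO y asks SWIS for just that window of the ordered result set.
                    $page = @(Get-SwisData $swis "$query WITH ROWS $start TO $($start + $batch - 1)")
                    $results += $page
                    $start += $batch
                } while ($page.Count -eq $batch)

                $results.Count

                If the paging clause is not available on this build, the same idea could probably be done with TOP plus a WHERE filter on a key column.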



                Any ideas?

                 

                 

                Dan H.