How to define LTI systems with time delay in SciPy?

The transfer function of an LTI system with time delay contains a factor exp(-Td * s), where Td is the time delay. In MATLAB, one can create such an LTI system in several ways (e.g. by using the "s" operator and writing the exponential term directly, or by setting the InputDelay/OutputDelay properties of tf objects). However, I cannot find any way to do this with SciPy's signal LTI objects. I also checked the Python Control Systems Library, but couldn't find a way there either.

I do not want to use a Padé approximation for the time delay; I want to set the exact time delay on the LTI system.

Does anyone know how to achieve this in Scipy or in any other external Python library?


I checked out the ltisys module on GitHub and attempted to create an LTI class with time delay. I think it should be straightforward to introduce an input time delay in the state equation if we replace Bu(t) by Bu(t - Td), where Td is the time delay. The following approach works for a single-input single-output system. It may not be free of bugs, but it solved my purpose.
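To see the idea in isolation before the full class: for a first-order plant, the delayed state equation x'(t) = (-x(t) + K*u(t - Td))/tau can be integrated directly with `odeint` by shifting the input's time argument. This is a standalone sketch (the plant values match the example further down; `hmax` just keeps the solver from stepping over the input discontinuity), and it can be checked against the analytic delayed step response:

```python
import numpy as np
from scipy.integrate import odeint

# Plant G(s) = K/(tau*s + 1) with input delay Td, i.e. 2/(4s + 1) * exp(-10s)
K, tau, Td = 2.0, 4.0, 10.0

def u(t):
    return 1.0 if t >= 0.0 else 0.0   # unit step applied at t = 0

def xdot(x, t):
    # State equation with the delayed input: x' = (-x + K*u(t - Td)) / tau
    return (-x + K * u(t - Td)) / tau

t = np.linspace(0, 60, 601)
x = odeint(xdot, 0.0, t, hmax=0.1).ravel()

# Analytic delayed step response: y = K*(1 - exp(-(t - Td)/tau)) for t >= Td
y_exact = np.where(t >= Td, K * (1.0 - np.exp(-(t - Td) / tau)), 0.0)
print(np.max(np.abs(x - y_exact)))  # maximum deviation from the analytic response
```

The output stays at zero until t = Td and then follows the ordinary first-order step response, which is exactly the behaviour the class below generalizes.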

#Inherit from the parent lti class to create an LTI class with input time delay

from numpy import atleast_1d, dot, linspace, nan_to_num, squeeze, transpose, zeros
from scipy import interpolate
from scipy.integrate import odeint
from scipy.signal import lti, lsim2
from matplotlib.pyplot import plot


class ltidelay(lti):
    def __init__(self, inputdelay, *args, **kwargs):
        super(ltidelay, self).__init__(*args, **kwargs)
        self.d = inputdelay
#Define a method to simulate an LTI system with time delay. It is just a copy of lsim2 with two changes: 1. the delay is taken from the `ltidelay` object, and 2. the state equation uses the delayed input.


def lsim3(system, U=None, T=None, X0=None, **kwargs):
    if isinstance(system, lti):
        sys = system
    else:
        sys = lti(*system)
    delay = sys.d
    if X0 is None:
        X0 = zeros(sys.B.shape[0], sys.A.dtype)
    if T is None:
        T = linspace(0, 10, 101)
    T = atleast_1d(T)
    if len(T.shape) != 1:
        raise ValueError("T must be a rank-1 array")
    if U is not None:
        U = atleast_1d(U)
        if len(U.shape) == 1:
            U = U.reshape(-1, 1)
        sU = U.shape
        if sU[0] != len(T):
            raise ValueError("U must have the same number of rows as elements in T")
        if sU[1] != sys.inputs:
            raise ValueError("The number of inputs in U is not compatible with the system")
        ufunc = interpolate.interp1d(T, U, kind='linear', axis=0, bounds_error=False)

        def fprime(x, t, sys, ufunc):
            # State equation with delayed input: x' = A x + B u(t - delay).
            # Outside the data range the interpolator returns NaN, which
            # nan_to_num maps to 0 (i.e. the input is zero before it starts).
            return dot(sys.A, x) + squeeze(dot(sys.B, nan_to_num(ufunc([t - delay]))))

        xout = odeint(fprime, X0, T, args=(sys, ufunc), **kwargs)
        yout = dot(sys.C, transpose(xout))
    else:
        def fprime(x, t, sys):
            return dot(sys.A, x)

        xout = odeint(fprime, X0, T, args=(sys,), **kwargs)
        yout = dot(sys.C, transpose(xout))
    return T, squeeze(transpose(yout)), xout

#Create an LTI system 2/(4s + 1) with a delay of 10

tf = ltidelay(10, 2, [4, 1])

#Create a step input and a time vector to simulate the LTI system

u = zeros(100)

u[50:100] = 1

t = linspace(1, 100, 100)

#Check the simulation
y = lsim3(tf, u, t, X0=0)

plot(y[1])

#Compare with lsim2, which ignores the time delay
y1 = lsim2(tf, u, t, X0=0)

plot(y1[1])

#The delay works
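For contrast with the exact-delay simulation above, the rational (Padé) route that the question wanted to avoid can also be built by hand. This is a standalone sketch; `delay_pade` is my own helper, not a SciPy function. It computes the [n/n] Padé coefficients of exp(-Td*s) as numerator/denominator lists (highest power first) that could be fed to `scipy.signal.lti`:

```python
import math
import numpy as np

def delay_pade(Td, n):
    # c_k coefficients of the [n/n] Pade approximant of exp(x):
    # P(x) = sum_k c_k x^k, with exp(x) ~= P(x) / P(-x)
    c = [math.factorial(2 * n - k) * math.factorial(n)
         / (math.factorial(2 * n) * math.factorial(k) * math.factorial(n - k))
         for k in range(n + 1)]
    # Substitute x = -Td*s and return highest-power-first coefficient lists
    num = [c[k] * (-Td) ** k for k in range(n + 1)][::-1]
    den = [c[k] * Td ** k for k in range(n + 1)][::-1]
    return num, den

num, den = delay_pade(10.0, 3)

# The approximant is all-pass: unit magnitude, phase ~ -Td*w for small w
ratio = np.polyval(num, 0.01j) / np.polyval(den, 0.01j)
print(abs(ratio), np.angle(ratio))  # ~ 1.0 and ~ -0.1 (= -Td * 0.01)
```

Unlike the exact-delay `lsim3` approach, this only approximates the phase of the delay over a limited bandwidth, which is precisely why the question rules it out.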