Can anyone give me some advice on how to solve an ODE in Python that has a time-delay implemented in it? I can't seem to figure out how to do it using scipy.integrate.odeint. What I am looking for should look like:
import numpy as np
# the constants in the equation
b = 1/50
d = 1/75
a = 0.8
G = 10 ** (-2)
tau = 0.5
u = [b, d, tau, a, G]
# enter initial conditions
N0 = 0.1
No0 = 10
w = [N0, No0]
def logistic(w, t, u):
    N, No = w
    b, d, tau, a, G = u
    dNdt = b * (No(t) - N(t)) * (N(t) / No(t)) - d * N(t - tau)
    dNodt = G * (a * No(t) - N(t)) * (N(t) / No(t))
    return [dNdt, dNodt]
# create timescale
stoptime = 1000.0
numpoints = 10000
t = np.linspace(0, stoptime, numpoints)
# in my previous code I would use scipy.integrate.odeint here to integrate my
# equations, but with a time-delay that doesn't work (I think)
soln = ...
where N(t), N(t - tau), etc. denote the time argument of the functions. Is there a good library for solving these types of equations? Thanks in advance!
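As background to why odeint alone is not enough: for a DDE you can apply the classical "method of steps", integrating one delay interval at a time and feeding the previous segment's dense interpolant into the delayed term. A minimal sketch using scipy's solve_ivp on the simpler test equation dy/dt = -y(t - tau) with constant history (the names `history` and `segments` are mine, not from the question):

```python
import numpy as np
from scipy.integrate import solve_ivp

tau = 0.5
history = lambda s: 1.0  # constant history y(s) = 1 for s <= 0

# Each entry of `segments` is a callable giving y on one past interval,
# so the delayed value y(s - tau) is always known from the segment before.
segments = [history]
t0, y0 = 0.0, history(0.0)

for k in range(4):  # integrate over four delay intervals
    past = segments[-1]
    # bind `past` via a default argument to avoid Python's late-binding trap
    rhs = lambda s, y, past=past: [-past(s - tau)]
    sol = solve_ivp(rhs, (t0, t0 + tau), [y0],
                    dense_output=True, rtol=1e-8, atol=1e-8)
    segments.append(lambda s, sol=sol: float(sol.sol(s)[0]))
    t0, y0 = sol.t[-1], sol.y[0, -1]
```

This scales poorly compared with a dedicated DDE solver, but it shows the idea with nothing beyond scipy.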
I am the author of JiTCDDE, which can solve delay differential equations and whose interface largely resembles that of scipy.ode. You can install it, e.g., with pip3 install jitcdde. To my knowledge, the other existing Python DDE libraries are either broken or based on deprecated dependencies.
The following code integrates your problem:
from jitcdde import t, y, jitcdde
import numpy as np
# the constants in the equation
b = 1/50
d = 1/75
a = 0.8
G = 10**(-2)
tau = 0.5
# the equation
f = [
    b * (y(1) - y(0)) * y(0) / y(1) - d * y(0, t - tau),
    G * (a * y(1) - y(0)) * y(0) / y(1),
]
# initialising the integrator
DDE = jitcdde(f)
# enter initial conditions
N0 = 0.1
No0 = 10
DDE.add_past_point(-1.0, [N0,No0], [0.0, 0.0])
DDE.add_past_point( 0.0, [N0,No0], [0.0, 0.0])
# short pre-integration to take care of discontinuities
DDE.step_on_discontinuities()
# create timescale
stoptime = 1000.0
numpoints = 100
times = DDE.t + np.linspace(1, stoptime, numpoints)
# integrating
data = []
for time in times:
    data.append(DDE.integrate(time))
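As a side note (not part of the original answer): DDE.integrate returns the state vector at the requested time, so `data` ends up as a list of [N, No] pairs; stacking it into a NumPy array makes plotting or saving straightforward. A minimal sketch with placeholder values standing in for the integrator output:

```python
import numpy as np

# Placeholder values: in the actual run, each entry of `data` is the
# state [N, No] returned by DDE.integrate(time) in the loop above.
data = [[0.10, 10.0], [0.12, 9.9], [0.15, 9.8]]

arr = np.vstack(data)          # shape (number of times, 2)
N, No = arr[:, 0], arr[:, 1]   # one column per variable
```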