I'm having a problem with some groupby code that I'm pretty sure used to work (on an older version of pandas). On 0.9 I get a "No numeric types to aggregate" error. Any ideas?
In [31]: data
Out[31]:
<class 'pandas.core.frame.DataFrame'>
DatetimeIndex: 2557 entries, 2004-01-01 00:00:00 to 2010-12-31 00:00:00
Freq: <1 DateOffset>
Columns: 360 entries, -89.75 to 89.75
dtypes: object(360)
In [32]: latedges = linspace(-90., 90., 73)
In [33]: lats_new = linspace(-87.5, 87.5, 72)
In [34]: def _get_gridbox_label(x, bins, labels):
....: return labels[searchsorted(bins, x) - 1]
....:
In [35]: lat_bucket = lambda x: _get_gridbox_label(x, latedges, lats_new)
In [36]: data.T.groupby(lat_bucket).mean()
---------------------------------------------------------------------------
DataError Traceback (most recent call last)
<ipython-input-36-ed9c538ac526> in <module>()
----> 1 data.T.groupby(lat_bucket).mean()
/usr/lib/python2.7/site-packages/pandas/core/groupby.py in mean(self)
295 """
296 try:
--> 297 return self._cython_agg_general('mean')
298 except DataError:
299 raise
/usr/lib/python2.7/site-packages/pandas/core/groupby.py in _cython_agg_general(self, how, numeric_only)
1415
1416 def _cython_agg_general(self, how, numeric_only=True):
-> 1417 new_blocks = self._cython_agg_blocks(how, numeric_only=numeric_only)
1418 return self._wrap_agged_blocks(new_blocks)
1419
/usr/lib/python2.7/site-packages/pandas/core/groupby.py in _cython_agg_blocks(self, how, numeric_only)
1455
1456 if len(new_blocks) == 0:
-> 1457 raise DataError('No numeric types to aggregate')
1458
1459 return new_blocks
DataError: No numeric types to aggregate
How are you generating the data?
See how the output shows that your data is of "object" dtype? The groupby operation specifically checks first whether each column is a numeric dtype.
In [31]: data
Out[31]:
<class 'pandas.core.frame.DataFrame'>
DatetimeIndex: 2557 entries, 2004-01-01 00:00:00 to 2010-12-31 00:00:00
Freq: <1 DateOffset>
Columns: 360 entries, -89.75 to 89.75
dtypes: object(360)
look ↑
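For illustration, here is a minimal sketch (the frame, its shape, and the grouping lambda are made up for the example, not taken from your post) that reproduces the same failure mode and shows that a numeric dtype lets the aggregation run:

import numpy as np
import pandas as pd

# Columns forced to object dtype -- the state an empty-then-filled
# DataFrame ends up in -- even though every value is numeric.
df = pd.DataFrame(np.random.randn(6, 3),
                  index=pd.date_range('2004-01-01', periods=6),
                  columns=[-89.75, -89.25, -88.75]).astype(object)

print(df.dtypes)  # every column reports: object

# On pandas 0.9 the next line raises DataError: No numeric types to aggregate
# df.T.groupby(lambda lat: round(lat)).mean()

# Converting to float first lets the cython aggregation path run:
print(df.astype(float).T.groupby(lambda lat: round(lat)).mean())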
Did you first initialize an empty DataFrame and then fill it? If so, that is probably why the behavior changed with the newer version: before 0.9, empty DataFrames were initialized with float dtype, but now they are object dtype. In that case you can change the initialization to DataFrame(dtype=float).
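A rough sketch of that initialization (the index and column values are guesses at the shape shown in your output, not taken from your code):

import numpy as np
import pandas as pd

dates = pd.date_range('2004-01-01', '2010-12-31')            # 2557 daily entries
lats = np.linspace(-89.75, 89.75, 360)                       # 360 grid columns
data = pd.DataFrame(index=dates, columns=lats, dtype=float)  # NaN-filled, float64

print(data.dtypes.unique())  # [float64] -- stays numeric as you fill it in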
You can also call frame.astype(float).
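For example, reusing the names from your session:

data = data.astype(float)            # convert the object columns to float64
data.T.groupby(lat_bucket).mean()    # now aggregates instead of raising DataError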