TL;DR: Your robots.txt is served fine, but Lighthouse cannot fetch it, because its audit currently cannot work with the connect-src (https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/connect-src) directive of your site's Content Security Policy. This is a known limitation, tracked as issue #4386 (https://github.com/GoogleChrome/lighthouse/issues/4386), which was fixed in Chrome 92 (https://github.com/GoogleChrome/lighthouse/pull/12423).
Explanation: Lighthouse attempts to fetch the robots.txt file by running a script from the document served at the root of your site. This is the code it uses to perform that request (it can be found in lighthouse-core, https://github.com/GoogleChrome/lighthouse/blob/master/lighthouse-core/gather/gatherers/seo/robots-txt.js):
const response = await fetch(new URL('/robots.txt', location.href).href);
If you try running this code from your site, you will notice that a "Refused to connect" error is thrown:

This error occurs because the browser enforces the Content Security Policy restrictions declared by the headers your site serves (split across several lines here for readability):
content-security-policy:
default-src 'self';
script-src 'self' *.google-analytics.com;
img-src 'self' *.google-analytics.com;
connect-src 'none';
style-src 'self' 'unsafe-inline' fonts.googleapis.com;
font-src 'self' fonts.gstatic.com;
object-src 'self';
media-src 'self';
frame-src 'self'
Note the connect-src 'none' part. Per the CSP specification (https://w3c.github.io/webappsec-csp/2/#directive-connect-src), this means that no URL can be loaded using script interfaces from the served document. In practice, any fetch is refused.
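To make the directive's effect concrete, here is a simplified sketch of how a connect-src source list decides whether a request is allowed. This is an illustration only, not the browser's actual matching algorithm, and connectAllowed is a hypothetical helper:

```javascript
// Simplified model of CSP connect-src matching (illustration only,
// not the real browser algorithm, which also handles schemes,
// wildcards, paths, etc.).
function connectAllowed(sourceList, requestOrigin, documentOrigin) {
  // 'none' blocks every connection, regardless of origin
  if (sourceList.includes("'none'")) return false;
  // 'self' allows same-origin requests only
  if (sourceList.includes("'self'") && requestOrigin === documentOrigin) return true;
  // otherwise, the request origin must be listed explicitly
  return sourceList.includes(requestOrigin);
}

console.log(connectAllowed(["'none'"], "https://example.com", "https://example.com")); // false
console.log(connectAllowed(["'self'"], "https://example.com", "https://example.com")); // true
```

With connect-src 'none', even the same-origin /robots.txt fetch that Lighthouse issues is refused, which is exactly what the audit runs into.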
This header is explicitly sent by the server layer of your Next.js application, due to the way you configured the Content Security Policy middleware (https://github.com/helmetjs/csp) in commit a6aef0e (https://github.com/amitschandillia/proost/commit/a6aef0ef4e23a7836ba57cd9d8198d31bc1de471#diff-5ea216fa13019bfada6d0326e01aa7bbR32):
import csp from 'helmet-csp';

server.use(csp({
  directives: {
    defaultSrc: ["'self'"],
    scriptSrc: ["'self'", '*.google-analytics.com'],
    imgSrc: ["'self'", '*.google-analytics.com'],
    connectSrc: ["'none'"],
    styleSrc: ["'self'", "'unsafe-inline'", 'maxcdn.bootstrapcdn.com'], // Remove unsafe-inline for better security
    fontSrc: ["'self'"],
    objectSrc: ["'self'"],
    mediaSrc: ["'self'"],
    frameSrc: ["'self'"]
  }
}));
Solutions/workarounds: To resolve the issue in your audit report, you can either:
- wait for (or submit) a fix in Lighthouse, or
- use the connect-src 'self' directive instead, which has the side effect of allowing browser-side HTTP requests from your Next.js app
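Applied to the middleware configuration above, the second workaround amounts to changing one line. This is a sketch of the adjusted configuration, not a drop-in copy of your file; only the connectSrc value differs from your commit:

```javascript
import csp from 'helmet-csp';

server.use(csp({
  directives: {
    defaultSrc: ["'self'"],
    scriptSrc: ["'self'", '*.google-analytics.com'],
    imgSrc: ["'self'", '*.google-analytics.com'],
    connectSrc: ["'self'"], // was ["'none'"]; allows same-origin fetch/XHR, so /robots.txt can be fetched
    styleSrc: ["'self'", "'unsafe-inline'", 'maxcdn.bootstrapcdn.com'],
    fontSrc: ["'self'"],
    objectSrc: ["'self'"],
    mediaSrc: ["'self'"],
    frameSrc: ["'self'"]
  }
}));
```

Note that 'self' is broader than 'none': any script running in your pages can now issue same-origin requests, so only make this change if that is acceptable for your threat model.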