How to Drive WRF with CFSR Data

Preface:

Whether for operational forecasting or for meso- and small-scale simulations, WRF is a mature and powerful tool that has become one of the most widely used models in the meteorological community. Using reanalysis data from different sources to provide initial and boundary conditions for mesoscale runs, and comparing the quality of those sources, is a question that practitioners care about, and a large topic in its own right.

Besides the common GFS (FNL), ERA-Interim and ERA5 datasets, I have also managed to run WRF with CFSR data. Below is a brief write-up of the method.

The workflow breaks down into the following steps:

data download, data selection, decoding, computation, and post-processing.

Reference script for downloading the data:

#!/usr/bin/env python
#################################################################
# Python Script to retrieve 93 online Data files of 'ds094.0',
# total 164.37M. This script uses 'requests' to download data.
#
# Highlight this script by Select All, Copy and Paste it into a file;
# make the file executable and run it on command line.
#
# You need to pass in your password as a parameter to execute
# this script, or you can set the environment variable RDAPSWD
# if your operating system supports it.
#
# Contact dattore@ucar.edu (Bob Dattore) for further assistance.
#################################################################


import sys, os
import requests

def check_file_status(filepath, filesize):
    sys.stdout.write('\r')
    sys.stdout.flush()
    size = int(os.stat(filepath).st_size)
    percent_complete = (size/filesize)*100
    sys.stdout.write('%.3f %s' % (percent_complete, '% Completed'))
    sys.stdout.flush()

# Get the RDA password: prompt interactively if it was passed neither
# as a command-line argument nor via the RDAPSWD environment variable
if len(sys.argv) < 2 and 'RDAPSWD' not in os.environ:
    import getpass
    pswd = getpass.getpass('Password: ')
else:
    try:
        pswd = sys.argv[1]
    except IndexError:
        pswd = os.environ['RDAPSWD']

url = 'https://rda.ucar.edu/cgi-bin/login'
values = {'email' : 'duanyapeng1@163.com', 'passwd' : pswd, 'action' : 'login'}
# Authenticate
ret = requests.post(url, data=values)
if ret.status_code != 200:
    print('Bad Authentication')
    print(ret.text)
    sys.exit(1)
dspath = 'http://rda.ucar.edu/dsrqst/DUAN391983/'
filelist = [
'dos-wget.391983.bat',
'unix-curl.391983.csh',
'unix-wget.391983.csh',
'201701300000.cdas1.20170129.sfluxgrbl.grb2',
'201701300600.cdas1.20170130.sfluxgrbl.grb2',
'201701301200.cdas1.20170130.sfluxgrbl.grb2',
'201701301800.cdas1.20170130.sfluxgrbl.grb2',
'201701310000.cdas1.20170130.sfluxgrbl.grb2',
'201701310600.cdas1.20170131.sfluxgrbl.grb2',
'201701311200.cdas1.20170131.sfluxgrbl.grb2',
'201701311800.cdas1.20170131.sfluxgrbl.grb2',
'201702010000.cdas1.20170131.sfluxgrbl.grb2',
'201701300000.cdas1.20170129.pgrbl.grb2',
'201701300600.cdas1.20170130.pgrbl.grb2',
'201701301200.cdas1.20170130.pgrbl.grb2',
'201701301800.cdas1.20170130.pgrbl.grb2',
'201701310000.cdas1.20170130.pgrbl.grb2',
'201701310600.cdas1.20170131.pgrbl.grb2',
'201701311200.cdas1.20170131.pgrbl.grb2',
'201701311800.cdas1.20170131.pgrbl.grb2',
'201702010000.cdas1.20170131.pgrbl.grb2',
'201701300000.cdas1.20170129.pgrbf.grb2',
'201701300600.cdas1.20170130.pgrbf.grb2',
'201701301200.cdas1.20170130.pgrbf.grb2',
'201701301800.cdas1.20170130.pgrbf.grb2',
'201701310000.cdas1.20170130.pgrbf.grb2',
'201701310600.cdas1.20170131.pgrbf.grb2',
'201701311200.cdas1.20170131.pgrbf.grb2',
'201701311800.cdas1.20170131.pgrbf.grb2',
'201702010000.cdas1.20170131.pgrbf.grb2',
'201701300000.cdas1.20170129.pgrbh.grb2',
'201701300600.cdas1.20170130.pgrbh.grb2',
'201701301200.cdas1.20170130.pgrbh.grb2',
'201701301800.cdas1.20170130.pgrbh.grb2',
'201701310000.cdas1.20170130.pgrbh.grb2',
'201701310600.cdas1.20170131.pgrbh.grb2',
'201701311200.cdas1.20170131.pgrbh.grb2',
'201701311800.cdas1.20170131.pgrbh.grb2',
'201702010000.cdas1.20170131.pgrbh.grb2',
'201701300000.cdas1.20170129.sfluxgrbf.grb2',
'201701300600.cdas1.20170130.sfluxgrbf.grb2',
'201701301200.cdas1.20170130.sfluxgrbf.grb2',
'201701301800.cdas1.20170130.sfluxgrbf.grb2',
'201701310000.cdas1.20170130.sfluxgrbf.grb2',
'201701310600.cdas1.20170131.sfluxgrbf.grb2',
'201701311200.cdas1.20170131.sfluxgrbf.grb2',
'201701311800.cdas1.20170131.sfluxgrbf.grb2',
'201702010000.cdas1.20170131.sfluxgrbf.grb2',
'201701300000.cdas1.20170129.ipvgrbf.grb2',
'201701300000.cdas1.20170129.ipvgrbh.grb2',
'201701300000.cdas1.20170129.ipvgrbl.grb2',
'201701300000.cdas1.20170129.splgrbf.grb2',
'201701300000.cdas1.20170129.splgrbl.grb2',
'201701300600.cdas1.20170130.ipvgrbf.grb2',
'201701301200.cdas1.20170130.ipvgrbf.grb2',
'201701301800.cdas1.20170130.ipvgrbf.grb2',
'201701310000.cdas1.20170130.ipvgrbf.grb2',
'201701300600.cdas1.20170130.ipvgrbh.grb2',
'201701301200.cdas1.20170130.ipvgrbh.grb2',
'201701301800.cdas1.20170130.ipvgrbh.grb2',
'201701310000.cdas1.20170130.ipvgrbh.grb2',
'201701300600.cdas1.20170130.ipvgrbl.grb2',
'201701301200.cdas1.20170130.ipvgrbl.grb2',
'201701301800.cdas1.20170130.ipvgrbl.grb2',
'201701310000.cdas1.20170130.ipvgrbl.grb2',
'201701300600.cdas1.20170130.splgrbf.grb2',
'201701301200.cdas1.20170130.splgrbf.grb2',
'201701301800.cdas1.20170130.splgrbf.grb2',
'201701310000.cdas1.20170130.splgrbf.grb2',
'201701300600.cdas1.20170130.splgrbl.grb2',
'201701301200.cdas1.20170130.splgrbl.grb2',
'201701301800.cdas1.20170130.splgrbl.grb2',
'201701310000.cdas1.20170130.splgrbl.grb2',
'201701310600.cdas1.20170131.ipvgrbf.grb2',
'201701311200.cdas1.20170131.ipvgrbf.grb2',
'201701311800.cdas1.20170131.ipvgrbf.grb2',
'201702010000.cdas1.20170131.ipvgrbf.grb2',
'201701310600.cdas1.20170131.ipvgrbh.grb2',
'201701311200.cdas1.20170131.ipvgrbh.grb2',
'201701311800.cdas1.20170131.ipvgrbh.grb2',
'201702010000.cdas1.20170131.ipvgrbh.grb2',
'201701310600.cdas1.20170131.ipvgrbl.grb2',
'201701311200.cdas1.20170131.ipvgrbl.grb2',
'201701311800.cdas1.20170131.ipvgrbl.grb2',
'201702010000.cdas1.20170131.ipvgrbl.grb2',
'201701310600.cdas1.20170131.splgrbf.grb2',
'201701311200.cdas1.20170131.splgrbf.grb2',
'201701311800.cdas1.20170131.splgrbf.grb2',
'201702010000.cdas1.20170131.splgrbf.grb2',
'201701310600.cdas1.20170131.splgrbl.grb2',
'201701311200.cdas1.20170131.splgrbl.grb2',
'201701311800.cdas1.20170131.splgrbl.grb2',
'201702010000.cdas1.20170131.splgrbl.grb2']
for file in filelist:
    filename=dspath+file
    file_base = os.path.basename(file)
    print('Downloading',file_base)
    req = requests.get(filename, cookies = ret.cookies, allow_redirects=True, stream=True)
    filesize = int(req.headers['Content-length'])
    with open(file_base, 'wb') as outfile:
        chunk_size=1048576
        for chunk in req.iter_content(chunk_size=chunk_size):
            outfile.write(chunk)
            if chunk_size < filesize:
                check_file_status(file_base, filesize)
    check_file_status(file_base, filesize)
    print()

Data selection:

After some trial and error:

For the surface data, use the files from the archives ending in sfluxgrbf.tar, e.g. 201701300000.cdas1.20170129.sfluxgrbf.grb2.
For the pressure-level data, use the files from the archives ending in pgrbh.tar, e.g. 201701300600.cdas1.20170130.pgrbh.grb2.

The file names follow the pattern <valid time>.cdas1.<cycle date>.<product>.grb2, so the product type can be read directly from the suffix.
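
As a minimal sketch of that selection step (the surface/ and pressure/ directory names are my own choice, not part of the original workflow), the downloaded files can be sorted into the two groups that will later be decoded:

import glob
import os
import shutil

# Hypothetical target directories for the two file groups
os.makedirs('surface', exist_ok=True)
os.makedirs('pressure', exist_ok=True)

for f in glob.glob('*.cdas1.*.grb2'):
    if f.endswith('.sfluxgrbf.grb2'):        # surface (flux) fields
        shutil.move(f, os.path.join('surface', f))
    elif f.endswith('.pgrbh.grb2'):          # pressure-level fields
        shutil.move(f, os.path.join('pressure', f))
    # other products (pgrbl, pgrbf, ipvgrb*, splgrb*, ...) are not needed here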

Decoding:

For the surface data, link Vtable.CFSR_sfc_flxf06 as the Vtable.
For the upper-air data, link Vtable.CFSR_press_pgbh06 as the Vtable.
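
Because the two file groups use different Vtables, ungrib has to be run twice, once per group, writing intermediate files under a different prefix each time. A minimal sketch of that two-pass decoding, assuming a standard WPS directory and the surface/ and pressure/ directories from the sorting step above (all paths and the SFC/PRES prefixes are my own choices):

import glob
import os
import subprocess

WPS_DIR = '/path/to/WPS'   # assumption: path to your compiled WPS
os.chdir(WPS_DIR)

# (Vtable, input files, intermediate-file prefix) for the two passes
passes = [
    ('ungrib/Variable_Tables/Vtable.CFSR_sfc_flxf06',   '/path/to/surface/*.sfluxgrbf.grb2', 'SFC'),
    ('ungrib/Variable_Tables/Vtable.CFSR_press_pgbh06', '/path/to/pressure/*.pgrbh.grb2',    'PRES'),
]

for vtable, pattern, prefix in passes:
    # Point the Vtable link at the table for this pass
    if os.path.lexists('Vtable'):
        os.remove('Vtable')
    os.symlink(vtable, 'Vtable')

    # link_grib.csh creates the GRIBFILE.AAA, GRIBFILE.AAB, ... links that ungrib reads
    subprocess.run(['./link_grib.csh'] + sorted(glob.glob(pattern)), check=True)

    # Before running ungrib, set prefix = '<prefix>' in the &ungrib section
    # of namelist.wps (by hand or with a small script)
    print('running ungrib pass with prefix', prefix)
    subprocess.run(['./ungrib.exe'], check=True)

metgrid can then pick up both sets of intermediate files by listing both prefixes in namelist.wps, e.g. fg_name = 'SFC', 'PRES' in the &metgrid section, before running metgrid.exe as usual.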

Computation:

The procedure is essentially the same as for running WRF with ECMWF data.
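
Two details worth checking, whatever the driving dataset: the files downloaded above are 6-hourly, so interval_seconds = 21600, and num_metgrid_levels in namelist.input must match what metgrid actually wrote. A quick way to read the latter from any met_em file (the file name below is only an example), assuming the netCDF4 package:

from netCDF4 import Dataset   # assumption: netCDF4 package is installed

# Example met_em file name; point this at one of your own metgrid outputs
with Dataset('met_em.d01.2017-01-30_00:00:00.nc') as nc:
    nlev = nc.dimensions['num_metgrid_levels'].size
    print('set num_metgrid_levels = %d in namelist.input' % nlev)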

Post-processing:

Again, the procedure is essentially the same as for ECMWF-driven runs.
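
As a minimal sketch of reading the model output (the wrfout file name and the choice of T2 are just examples), again assuming the netCDF4 package:

import numpy as np
from netCDF4 import Dataset   # assumption: netCDF4 package is installed

# Example output file name; replace with one of your own wrfout files
with Dataset('wrfout_d01_2017-01-30_00:00:00') as nc:
    t2 = nc.variables['T2'][0, :, :]   # 2-m temperature (K), first output time
    print('domain-mean 2-m temperature: %.2f K' % np.mean(t2))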

Contact the author: DYP
