GRLIB documentation, Chinese edition (collection)
RTKLIB manual (Chinese translation, excerpt)
RTKLIB provides the following general-purpose C functions, callable from a user AP (application program). Users can use these functions to develop their own positioning APs.
(1) Matrix and vector functions
(2) Time and string functions
(3) Coordinate transformation and geoid model
(4) Navigation processing
(5) Positioning models (troposphere, ionosphere, antenna PCV)
(6) SBAS DGPS/DGNSS correction
(7) Single point positioning
(8) Carrier-based and code-based relative positioning
(9) OTF integer ambiguity resolution
(10) Receiver raw binary data input
(11) Positioning solution/NMEA input/output
(12) RINEX observation data/navigation message input/output
(13) Precise ephemeris input
(14) Stream data communication library
(15) NTRIP (Networked Transport of RTCM via Internet Protocol) library
(16) RTK-GPS/GNSS positioning server
(17) RTCM 2.3 and 3.0/3.1/3.2 message handling
(18) Downloader functions

The following instructions show how to use the RTKLIB library in a user AP:
(1) Add the following include directive to the source program of the user AP: #include "rtklib.h"
(2) Set the following compiler option to add the RTKLIB source directory path to the compiler include paths: -I rtklib_<ver>\src
(3) Add the necessary RTKLIB library source files to the source programs set for the AP build. Refer to Appendix C, Library APIs, for the library function list and the source programs provided by RTKLIB.

Appendix A CUI Command References
A.1 RTKRCV
SYNOPSIS
rtkrcv [-s][-p port|-d dev][-o file][-t level]
DESCRIPTION
A command-line version of the real-time positioning AP of RTKLIB. To start or stop the RTK server, configure options, or print solutions and status, log in to a console and input commands. By default, stdin/stdout are used for the console. Use the -p option for network login with the telnet protocol. To show the available commands, type ? or help on the console. The initial processing options are loaded from the default configuration file rtkrcv.conf. To change the file, use the -o option. To configure the processing options, edit the configuration file or use the set, load or save commands on the console. To shut down the program, use the shutdown command on the console or send the USR2 signal to the process. For the configuration file format, refer to B.4.
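Item (3) in the function list above covers coordinate transformation between ECEF and geodetic coordinates. As an illustration of the math behind such a function, here is a minimal Python sketch of the WGS84 ECEF-to-geodetic conversion. The name ecef2pos is borrowed from RTKLIB's C API (declared in rtklib.h), but the body below is an independent reimplementation for illustration, not the library's code.

```python
import math

def ecef2pos(x, y, z):
    """Convert WGS84 ECEF coordinates (m) to geodetic latitude,
    longitude (rad) and ellipsoidal height (m) by fixed-point
    iteration on the latitude-dependent term.
    Assumes a point away from the exact Earth center."""
    a = 6378137.0               # WGS84 semi-major axis (m)
    f = 1.0 / 298.257223563     # WGS84 flattening
    e2 = f * (2.0 - f)          # first eccentricity squared
    r2 = x * x + y * y
    zk = z
    v = a
    while True:
        zp = zk
        sinp = zk / math.sqrt(r2 + zk * zk)
        v = a / math.sqrt(1.0 - e2 * sinp * sinp)
        zk = z + v * e2 * sinp
        if abs(zk - zp) < 1e-4:
            break
    lat = math.atan2(zk, math.sqrt(r2))
    lon = math.atan2(y, x)
    h = math.sqrt(r2 + zk * zk) - v
    return lat, lon, h

# A point on the equator at the prime meridian, on the ellipsoid:
lat, lon, h = ecef2pos(6378137.0, 0.0, 0.0)
```

For real work, use the library function itself; RTKLIB's version operates on double arrays and handles degenerate geometry.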
RTKLIB manual (Chinese edition)
1. Directory structure
\app -- build environments for the APs
\bin -- executable binary APs and Windows DLLs
\data -- sample data for the APs
\doc -- document files
\lib -- library build environment
\src -- source programs of the RTKLIB library
\test -- test programs and data
\util -- utility programs
2. \bin\rtklaunch.exe -- application launcher
3. RTKNAVI: real-time positioning. Input raw observation data from GPS/GNSS receivers and perform the navigation processing in real time.
3.1 Run \bin\rtknavi.exe.
3.2 Real-time positioning with RTKNAVI requires raw GPS/GNSS receiver observation data and satellite ephemerides as input. Click the I button to open the Input Streams dialog and check the settings of the three streams: Rover, Base Station and Correction. Depending on the positioning mode, only one of them may be needed; the Base Station and Correction streams are not always required.
The stream type can be selected from the following options:
(a) Serial: input data from a serial port
(b) TCP Client: connect to a TCP server and input data via the TCP connection
(c) TCP Server: accept a TCP client connection and input data via the TCP connection
(d) NTRIP Client: connect to an NTRIP caster and input data
(e) File: input data from a log file
(f) FTP: input data after downloading a file via FTP
(g) HTTP: input data after downloading a file via HTTP
3.3 Select the stream type "Serial" and click the ... button to set its options.
3.4 If you selected Serial, TCP Client or TCP Server as the stream type, you can set the startup and shutdown commands sent to the GPS/GNSS receiver through the stream. To set the commands, press the ... button under the "Cmd" label.
Configure the commands in the "Serial/TCP Commands" dialog; command sets can be loaded and saved.
3.5 If the stream type is "File", set the input file path; the data must be receiver raw data. The replay time can also be set.
3.6 To set the output stream format, click the O button to open the "Output Streams" dialog and set the stream type.
3.7 If the type is "File", certain keywords in the file path are replaced by the date and time; press the adjacent button to view them. Select "Swap Intv" to switch to a new output file at the chosen interval.
3.8 To log an input stream as-is, click the corresponding button to open the "Log Streams" dialog; as in the "Output Streams" dialog, keywords in the path are replaced.
3.9 When the settings are complete, click the Start button.
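Section 3.7 says that keywords in an output file path are replaced by the date and time. A rough sketch of that mechanism in Python; the keyword names %Y, %m, %d, %h used here are assumptions for the illustration, and the authoritative keyword list is in the RTKLIB manual:

```python
from datetime import datetime

# Keyword-to-field mapping assumed for this sketch; see the RTKLIB
# manual for the actual replacement keywords RTKNAVI supports.
def replace_path_keywords(path, t):
    """Replace date/time keywords in an output file path."""
    repl = {
        "%Y": f"{t.year:04d}",   # 4-digit year
        "%m": f"{t.month:02d}",  # 2-digit month
        "%d": f"{t.day:02d}",    # 2-digit day of month
        "%h": f"{t.hour:02d}",   # 2-digit hour
    }
    for key, val in repl.items():
        path = path.replace(key, val)
    return path

p = replace_path_keywords("rover_%Y%m%d_%h.pos", datetime(2010, 7, 25, 6, 0, 0))
# p == "rover_20100725_06.pos"
```

With "Swap Intv" enabled, a path like this can be re-evaluated at each swap period, so every output file gets a distinct time-stamped name.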
MATLAB Kriging Toolbox manual (Chinese translation)
The MATLAB kriging toolbox (version 4.0, July 2001). Translated by 阿童木看星星; original authors Yves Gratton et al. The kriging toolbox is distributed free of charge and without technical support.

Specifications
Version 4.0 of the kriging toolbox is compatible with MATLAB 6.1. It is an upgrade of version 2.0, which was compiled under MATLAB 4.2, and of version 3.0, which was compiled under MATLAB 5.1. Note that this upgrade still uses only 2-D matrices, even though newer MATLAB versions support larger matrix dimensions. The optimization functions require the MATLAB Optimization Toolbox; however, stand-alone substitute functions are also provided for users who have not purchased the Optimization Toolbox (see fitvario.m). A statistics toolbox is also needed for normal operation and is supplied with the kriging toolbox.

Description
The development of this toolbox grew out of the need for objective analysis of scalar fields in 2 or 3 dimensions in physical oceanography. This type of interpolation generally gives better results than standard interpolation methods. Moreover, it has the non-negligible advantage of providing an estimate of the interpolation error. The functions of this toolbox are drawn almost entirely from the book by Deutsch and Journel (1992) and the paper by Marcotte (1991). The variogram functions were written as MEX files, while the cokriging functions were published, in MATLAB form, in Marcotte's 1991 paper. All the parameters and examples can be found, in English, in those two publications. The book by Journel and Huijbregts (1992) is the best reference on semivariograms. A complete example of optimal estimation in physical oceanography can be found in the paper by Denman and Freeland (1985). In addition, kridemo demonstrates a 2-D objective analysis with contour plots.
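The central quantity these functions estimate is the (semi)variogram, gamma(h) = 0.5 * E[(z(x) - z(x+h))^2]. As a language-neutral illustration of the idea, here is a Python sketch of the textbook empirical estimator; it is not a translation of the toolbox's MATLAB code:

```python
import math

def empirical_semivariogram(coords, values, bin_width, n_bins):
    """Empirical semivariogram of scattered data: for each distance
    bin, average 0.5 * (z_i - z_j)^2 over all point pairs whose
    separation distance falls in that bin."""
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(coords[i], coords[j])
            b = int(d // bin_width)
            if b < n_bins:
                sums[b] += 0.5 * (values[i] - values[j]) ** 2
                counts[b] += 1
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]

# 1-D check: for z = x sampled at x = 0..4, pairs at lag 1 give
# gamma = 0.5 * 1^2 = 0.5 and pairs at lag 2 give 2.0.
coords = [(float(x), 0.0) for x in range(5)]
values = [float(x) for x in range(5)]
gamma = empirical_semivariogram(coords, values, bin_width=1.0, n_bins=3)
```

A model variogram (spherical, exponential, etc.) is then fitted to these binned values; in the toolbox, that fitting is presumably the role of the fitvario.m mentioned above.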
References
Denman, K.L. and H.J. Freeland, 1985. Correlation scales, objective mapping and statistical tests of geostrophy over the continental shelf. J. Mar. Res., 43: 517-539.
Deutsch, C.V. and A.G. Journel, 1992. GSLIB: Geostatistical Software Library and User's Guide. Oxford University Press, Oxford, 340 pp.
Journel, A.G. and C.J. Huijbregts, 1992. Mining Geostatistics. Academic Press, New York, 600 pp.
Marcotte, D., 1991. Cokriging with MATLAB.
Swift GRB Products Guide (manual)
Swift GRB Products Guide
by: Alex Padgett, Davide Donato, Lorella Angelini
HEASARC, Laboratory for High Energy Astrophysics, NASA/GSFC, Code 662, Greenbelt, MD 20771
Nov 2009

1 Introduction
This document describes the on-line data products for the Swift Gamma Ray Burst Catalog.

2 Archive Structure and Filenames

2.1 Directory Layout
A complete GRB products set contains at most five sub-directories:
- html: all required files for on-line GRB web pages
- images: GIF images from each available instrument
- info: FITS information table (see §2.2.4)
- lightcurves: FITS light curves and plots for all available instruments
- spectra: BAT and XRT spectra and plots
If there are no archive data for a given GRB (e.g. TDRSS messages only), then there will only be an html directory.

2.2 File Naming Conventions

2.2.1 Light Curves and Plots
Light curves in FITS format are generated for each Swift instrument, and for the various modes of each instrument. Plots of some of these light curves are also produced, as well as plots of combinations of them. The FITS and gif light curve products archived for each GRB have the general form GRBNAME[I][MM][BB][OO].

Table 1: left two columns, instrument character codes and their meanings; right two columns, instrument mode character codes and their meanings. (The character-code columns were lost in extraction; the meanings are:) instruments: XRT only; BAT only; UVOT only; mixed instruments. Modes: XRT PC mode; XRT WT mode; XRT WT/PC mode; XRT and BAT; BAT pre-pre-slew epoch; BAT pre-slew epoch; BAT in-slew epoch; BAT after-slew epoch; BAT all available epochs; UVOT (all filters).

Table 2: energy band character codes. (The code column was also lost; the bands listed include 15-350 keV, all filters, 0.3-2 keV and 15-150 keV.)

Light curve plots use the suffix '-lc.gif'. Note that not all light curves have a corresponding plot (in fact, most will not), and some gifs are combinations of different light curves.

2.2.2 Spectra and Spectral Plots
In addition to light curves, there are spectra and related plots produced for the BAT and XRT. These are named in a similar manner, with 'ph.gif' in place of '.pha' for the plots.

Table 3: data type character codes. (The codes, c, n, f, a, r, m, were separated from their meanings in extraction.)
* The ...E (if extant), TIMEDEL and FRACEXP columns are valid for all curves in the file. ** Magnitude only applies to UVOT light curves and plots, and fitted light curves only apply to XRT light curve plots.

Table 4: image plot character codes: i1, i2, i3 (meanings lost in extraction).

2.2.3 Images
Image files follow the pattern GRBNAME[I][MM][BB][OO][TT][a-z?].gif, where GRBNAME, I, MM, BB and OO are identical to §2.2.1. TT is the type of image as shown in Table 4. A trailing character from a through z indicates that a mosaic image was split into multiple images.

Table 5: GRB HTML pages: main interface html page; BAT and XRT spectra html page; light curves html page; images html page (each named with the grb051221a-style GRBNAME prefix).

2.2.4 Information FITS File
In addition to the data products above, there will also be an information table for each GRB. This table will have at most four extensions: 'XRTINFO', 'XRTFLDREG', 'XRTFITINFO' and 'BATINFO'.

Table 6: XRT-only FITS products. (The filename and description columns were misaligned in extraction.) Filenames: grb051221a...xpcetsra.lc, xpce1sra.lc, xpce2sra.lc, xpcetsrrb.lc, xwtetsrrb.lc; xpcetsr.pha, xpcetbg.pha, xpcet.arf; xpcetsrt1.pha, xpcetbgt1.pha, xpcetsrt1.arf; xpcetsrt3.pha, xpcetbgt3.pha, xpcetsrt3.arf; xwtetsr.pha, xwtetbg.pha, xwtetsr.arf; xwtetsrt1.pha, xwtetbgt1.pha, xwtetsrt1.arf; xwtetsrt3.pha, xwtetbgt3.pha, xwtetsrt3.arf. Descriptions: XRT PC and WT 0.3-10 keV, 0.3-2 keV and 2-10 keV source light curves and hardness-ratio curves; XRT PC and WT total source spectra (excluding flares), total background spectra and total ancillary response files; and XRT PC and WT source/background spectra and ancillary response files for time intervals 1-3.

Table 7: XRT-only GIF products: xpwetsrcb (XRT PC/WT 0.3-10 keV source calibrated, binned light curve); xpwetsrrb (XRT PC/WT 0.3-10 keV source hardness ratio, binned); XRT PC 0.3-10 keV image mosaic; XRT PC 0.3-10 keV double diagonal image; XRT WT 0.3-10 keV image mosaic; XRT WT 0.3-10 keV double diagonal image; xpwetsrt1 (XRT PC/WT time interval 1 source spectrum); xpwetsrt3 (XRT PC/WT time interval 3 source spectrum).

Table 8: BAT-only products. FITS: BAT 15-350 keV source net light curve; BAT 4-channel source net light curve; BAT 0.3-150 keV multi-band flux BAT-block light curves; spectra and responses bppetsr.pha/.rsp, bpsetsr.pha/.rsp, bisetsr.pha/.rsp, basetsr.pha/.rsp, bbaetsr.pha/.rsp. GIF: bbaetsrn (BAT 15-350 keV source net light curve); bppe2src (BAT 15-150 keV pre-pre-slew source spectrum, cutoff power-law); bpse2src (pre-slew); bise2src (in-slew); base2src (after-slew); bbae2src (total); bbaetsri1.gif.

Table 9: UVOT-only products: UVOT all-filter source binned light curves; UVOT optical source image; UVOT optical source finding chart image; uuvetsrmb (UVOT optical source magnitude light curves).

Table 10: XRT and BAT combined products: mxbe3srfb (XRT + BAT 2.0-10 keV source flux binned light curve).

4 Example Plots

4.1 XRT Plots
Figure 1 shows the plotted XRT PC and WT mode corrected count rate, flux and hardness ratio light curves for GRB 090618. This figure also contains the fitted XRT light curve, which is used to extract and fit spectra from up to three epochs. Figure 2 shows the extracted and fitted spectra from each fitted epoch, and the total fitted spectra. Figure 3 shows the plotted XRT PC and WT images and image mosaics for GRB 090618.

4.2 BAT Plots
Figures 4 through 7 show typical BAT-only plots. Figure 4 shows the BAT light curve plots. Figure 5 shows the BAT spectral plots with a power-law model fit. Figure 6 shows the same, but with an exponential cutoff model fit. And Figure 7 shows the pre- and post-slew BAT burst images.

4.3 UVOT Plots
Figure 8 shows the flux-converted light curves for GRB 090618, and the magnitude light curves for the same source. Figure 9 shows the highest signal-to-noise image from the UVOT, and the UVOT finding chart image for GRB 090618.

4.4 Combined BAT/XRT Plots
Figure 10 shows the combined BAT and XRT flux-converted light curve plots for GRB 090618.

Figure 1: Clockwise from top left: GRB 090618 XRT PC and WT mode corrected rate light curves; flux light curves; modelled light curve (dashed lines show fitted time breaks); 0.3-2.0 keV light curves, 2-10 keV light curves, and hardness ratios.
Figure 2: Clockwise from top left: GRB 090618 XRT PC and WT mode spectrum from the first time interval (see bottom right panel of Figure 1); second time interval spectra; third time interval spectra; total spectrum.
Figure 3: Clockwise from top left: GRB 090618 XRT PC double image; per-segment XRT PC image mosaic; per-segment XRT WT image mosaic; XRT WT double image.
Figure 4: Left: GRB 090618 4-channel BAT constant time bin light curves; the energy band of each light curve is shown in the upper right corner of each panel. Right: GRB 090618 1-channel (15-350 keV) BAT constant time bin light curve.
Figure 5: Clockwise from top left: GRB 090618 BAT pre-slew spectrum fitted with a simple power-law model normalized at 50 keV; BAT slew spectrum fitted with the same model; BAT post-slew spectrum fitted with the same model; total BAT spectrum fitted with the same model.
Figure 6: Clockwise from top left: GRB 090618 BAT pre-slew spectrum fitted with an exponential cut-off power-law model normalized at 50 keV; BAT slew spectrum, post-slew spectrum and total BAT spectrum fitted with the same model.
Figure 7: Left: GRB 090618 BAT pre-slew image of the GRB field of view. Right: GRB 090618 BAT post-slew image.
Figure 8: Left: GRB 090618 UVOT integrated flux light curves. Right: GRB 090618 UVOT magnitude light curves.
Figure 9: Left: GRB 090618 UVOT highest signal-to-noise image. Right: GRB 090618 UVOT finding chart image.
Figure 10: Left: GRB 090618 BAT and XRT flux-converted light curves in 0.3-10 keV. Right: GRB 090618 BAT and XRT flux-converted light curves in 2-10 keV.
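The single-character instrument codes of Table 1 did not survive extraction, but the example filenames in Tables 6-10 (xpce..., bppe..., uuve..., mxbe...) suggest that the leading character encodes the instrument combination. A small sketch of decoding it; the x/b/u/m mapping is inferred from those examples rather than stated in the recovered text:

```python
# Leading-character instrument codes, inferred from the example
# filenames in Tables 6-10 (an assumption; the code column of
# Table 1 was lost in extraction).
INSTRUMENT_CODES = {
    "x": "XRT only",
    "b": "BAT only",
    "u": "UVOT only",
    "m": "mixed instruments",
}

def instrument_of(product_name):
    """Guess the instrument implied by a product filename fragment."""
    return INSTRUMENT_CODES.get(product_name[:1], "unknown")

kinds = [instrument_of(n)
         for n in ["xpcetsra.lc", "bppetsr.pha", "uuvetsrmb", "mxbe3srfb"]]
```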
[LEON_Grlib_Guide1_0325] GRLIB installation guide, revision 1103
Revision 0.1, 2010-10-12. Revision 0.2, 2011-03-24.
GRLIB IP Library User's Manual (Part 1: Installation Guide)
Version 1.0.22, Chinese revision 1.0
Jiri Gaisler, Sandi Habinc
Translated by: ****************
Copyright Aeroflex Gaisler, 2010

Contents
1 Introduction
  1.1 Overview
  1.2 Organization
  1.3 On-chip bus
2 Installation
  2.1 Installation
  2.2 Directory organization
  2.3 Host platform support
    2.3.1 Linux
    2.3.2 Windows with Cygwin installed
    2.3.3 (Supplement) A workable installation alongside ISE under Windows

1 Introduction
1.1 Overview
The GRLIB IP library is a complete set of reusable IP cores built for system-on-chip (SoC) development.
These IP cores are centered on a common on-chip bus and use a coherent method for simulation and synthesis.
1.2 Organization
GRLIB is organized around VHDL libraries, where each major IP (or IP vendor) is assigned a unique library name. Using separate libraries avoids name clashes between IP cores and hides unnecessary implementation details from the end user. Each VHDL library typically contains a number of packages, declaring the exported IP cores and their interface types. Simulation and synthesis scripts are created automatically by a global makefile. Adding and removing of libraries and packages can be made without modifying any global files, ensuring that modification of one vendor's library will not affect other vendors. A few global libraries are provided to define shared data structures and utility functions.
GRLIB provides automatic script generators for the Modelsim, Ncsim, Aldec, Sonata and GHDL simulators, and the Synopsys, Synplify, Cadence, Mentor, Actel, Altera, Lattice, and Xilinx implementation tools. Support for other CAD tools can easily be added.

1.3 On-chip bus
GRLIB is designed to be 'bus-centric', i.e. it is assumed that most of the IP cores will be connected through an on-chip bus. The AMBA-2.0 AHB/APB bus has been selected as the common on-chip bus, due to its market dominance (ARM processors) and because it is well documented and can be used for free without license restrictions. The figure below shows an example of a LEON3 system designed with GRLIB.

1.4 Distributed address decoding
Adding an IP core to the AHB bus is unfortunately not as straightforward as just connecting the bus signals. The address decoding of AHB is centralized, and a shared address decoder and bus multiplexer must be modified each time an IP core is added or removed. To avoid dependencies on a global resource, distributed address decoding has been added to the GRLIB cores and AMBA AHB/APB controllers.

1.5 Interrupt steering
GRLIB provides a unified interrupt handling scheme by adding 32 interrupt signals to the AHB and APB buses.
An AMBA module can drive any of the interrupts, and the unit that implements the interrupt controller can monitor the combined interrupt vector and generate the appropriate processor interrupt. In this way, interrupts can be generated regardless of which processor or interrupt controller is being used in the system, and do not need to be explicitly routed to a global resource. The scheme allows interrupts to be shared by several cores and resolved by software.

1.6 Plug&play capability
A broad interpretation of the term 'plug&play' is the capability to detect the system hardware configuration through software. Such capability makes it possible to use software applications or operating systems which automatically configure themselves to match the underlying hardware. This greatly simplifies the development of software applications, since they do not need to be customized for each particular hardware configuration.
In GRLIB, the plug&play information consists of three items: a unique IP core ID, AHB/APB memory mapping, and the used interrupt vector. This information is sent as a constant vector to the bus arbiter/decoder, where it is mapped on a small read-only area in the top of the address space. Any AHB master can read the system configuration using standard bus cycles, and a plug&play operating system can be supported.
To provide the plug&play information from the AMBA units in a harmonized way, a configuration record for AMBA devices has been defined (figure 1). The configuration record consists of 8 32-bit words, where four contain configuration words defining the core type and interrupt routing, and four contain so-called 'bank address registers' (BAR), defining the memory mapping.
The configuration word for each device includes a vendor ID, device ID, version number, and interrupt routing information. A configuration type indicator is provided to allow for future evolvement of the configuration word.
The BARs contain the start address for an area allocated to the device, a mask defining the size of the area, information on whether the area is cacheable or pre-fetchable, and a type declaration identifying the area as an AHB memory bank, AHB I/O bank or APB I/O bank. The configuration record can contain up to four BARs, and the core can thus be mapped on up to four distinct address areas.

2 Installation
2.1 Installation
(The installation below is demonstrated on RedHat Linux; other *nix environments, and Cygwin, should be largely the same.)
GRLIB is distributed as a gzipped tar-file and can be installed in any location on the host system.
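Before moving on to installation: section 1.6 above says the first configuration word of each device packs a vendor ID, device ID, version number and interrupt routing information into 32 bits. A sketch of that packing in Python; the exact field boundaries used here (vendor in bits 31:24, device in 23:12, version in 9:5, IRQ in 4:0) are taken from the common GRLIB layout and should be checked against the GRLIB IP Core User's Manual:

```python
def amba_pnp_word(vendor, device, version, irq):
    """Pack a GRLIB-style AMBA plug&play identification word:
    vendor[31:24] | device[23:12] | version[9:5] | irq[4:0].
    (Field boundaries are an assumption; verify against the manual.)"""
    assert 0 <= vendor < 0x100 and 0 <= device < 0x1000
    assert 0 <= version < 0x20 and 0 <= irq < 0x20
    return (vendor << 24) | (device << 12) | (version << 5) | irq

# Example: vendor 0x01, device 0x00C, version 0, interrupt line 2
word = amba_pnp_word(0x01, 0x00C, 0, 2)
# word == 0x0100C002
```

Any AHB master reading the plug&play area decodes such words back into (vendor, device, version, irq) tuples; that is the basis of the software auto-configuration described above.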
FRAGSTATS 4.1 help, Chinese edition (machine-translated)
Overview

What is FRAGSTATS?
FRAGSTATS is a spatial pattern analysis program for categorical maps that represent landscape structure under the landscape mosaic model. Note that FRAGSTATS is not suitable for continuous surface maps that represent landscape structure under the landscape gradient model. The landscape subject to analysis is user-defined and can represent any spatial phenomenon. FRAGSTATS simply quantifies the spatial heterogeneity of the landscape as represented by the categorical map; it is incumbent on the user to establish a sound basis for defining and scaling the landscape, in terms of both its thematic content and resolution and its spatial grain and extent. We strongly recommend reading the FRAGSTATS background section before using the program. Importantly, FRAGSTATS output is meaningful only if the landscape, as defined, is meaningful relative to the phenomenon under consideration.
Scale considerations
FRAGSTATS requires the spatial grain (grid resolution) to be greater than 0.001 m, but it places no limit on the spatial extent of the landscape itself, although there are memory limits on the size of grid that can be loaded. In its calculations, however, FRAGSTATS reports distance-based and area-based metrics in meters and hectares, respectively. Landscapes of extreme extent and/or resolution can therefore produce rather cumbersome numbers and/or suffer from rounding error. However, FRAGSTATS output is written to ASCII data files that can be manipulated with any database management program to rescale the metrics or convert them to other units (for example, converting hectares to acres).
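As a concrete instance of the rescaling just described, the sketch below turns a patch's raster cell count into area in square meters and hectares, then converts hectares to acres (1 ha = 10,000 m^2, and 1 ha is about 2.4710538 acres; these are standard factors, not values from the FRAGSTATS text):

```python
def patch_area(n_cells, cell_size_m):
    """Area of a raster patch from its cell count and square cell
    size: square meters, hectares (FRAGSTATS' reporting unit for
    areas), and acres as an example of post-hoc unit conversion."""
    m2 = n_cells * cell_size_m ** 2
    ha = m2 / 10_000.0            # 1 hectare = 10,000 m^2
    acres = ha * 2.4710538        # standard hectares-to-acres factor
    return m2, ha, acres

# 2500 cells of 10 m x 10 m: 250,000 m^2 = 25 ha
m2, ha, acres = patch_area(2500, 10.0)
```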
Computer requirements
FRAGSTATS is a stand-alone program written in Microsoft Visual C++ for use in the Windows operating environment, and it runs as a 32-bit process (even on a 64-bit computer). FRAGSTATS was developed and tested on the Windows 7 operating system, although it should run on all Windows operating systems. Note that FRAGSTATS is highly platform-dependent, since it was developed in the Microsoft environment, so porting it to other platforms would not be easy. FRAGSTATS is a computationally intensive program; its performance depends on both processor speed and computer memory (RAM). Ultimately, the ability to process an image depends on sufficient memory being available, and the speed of that processing depends on processor speed. The memory constraint deserves particular attention. As a 32-bit process, FRAGSTATS can use at most 2 GB of memory; however, a properly configured Windows installation can let a 32-bit process see up to 3 GB of memory.
interpretR 0.2.5 user manual
Package 'interpretR'
August 20, 2023
Type Package
Title Binary Classifier and Regression Model Interpretation Functions
Version 0.2.5
Date 2023-08-19
Depends randomForest
Imports AUC, stats, graphics
Author Michel Ballings and Dirk Van den Poel
Maintainer Michel Ballings <*************************>
Description Compute permutation-based performance measures and create partial dependence plots for (cross-validated) 'randomForest' and 'ada' models.
License GPL (>= 2)
RoxygenNote 7.2.3
NeedsCompilation no
Repository CRAN
Date/Publication 2023-08-19 22:22:31 UTC

R topics documented:
interpretR-package
interpretRNews
parDepPlot
variableImportance
Index

interpretR-package    Partial Dependence Plots and Permutation-Based Performance Measures

Description
Compute permutation-based performance measures (for binary classification) and create partial dependence plots (cross-validated classification and regression models). Currently only binary classification and regression models estimated with the package randomForest are supported. Binary classification models estimated with ada are also supported.

Author(s)
Authors: Michel Ballings and Dirk Van den Poel. Maintainer: <*************************>

See Also
parDepPlot, variableImportance

interpretRNews    Display the NEWS file

Description
interpretRNews shows the NEWS file of the interpretR package.

Usage
interpretRNews()

Author(s)
Authors: Michel Ballings and Dirk Van den Poel. Maintainer: <*************************>

See Also
parDepPlot

Examples
interpretRNews()

parDepPlot    Model interpretation functions: Partial Dependence Plots

Description
parDepPlot creates partial dependence plots for binary (cross-validated) classification models and regression models. Currently only binary classification models estimated with the packages randomForest and ada are supported. In addition, randomForest regression models are supported.

Usage
parDepPlot(
  ,
  object,
  data,
  rm.outliers = TRUE,
  fact = 1.5,
  n.pt = 50,
  robust = FALSE,
  ci = FALSE,
  u.quant = 0.75,
  l.quant = 0.25,
  xlab = substr(, 1, 50),
  ylab = NULL,
  main = if (any(class(object) %in% c("randomForest", "ada")))
    paste("Partial Dependence on", substr(, 1, 20))
  else paste("Cross-Validated Partial Dependence on", substr(, 1, 10)),
  logit = TRUE,
  ylim = NULL,
  ...
)

Arguments
(first argument, name lost in extraction)  the name of the predictor, as a character string, for which a partial dependence plot has to be created.
object       can be a model or a list of cross-validated models. Currently only binary classification models built using the packages randomForest and ada are supported.
data         a data frame containing the predictors for the model, or a list of data frames for cross-validation, with length equal to the number of models.
rm.outliers  boolean; remove the outliers in the predictor. Outliers are values that are smaller than max(Q1 - fact*IQR, min) or greater than min(Q3 + fact*IQR, max). Overridden if xlim is used.
fact         factor to use in rm.outliers. The default is 1.5.
n.pt         if the predictor is continuous, the number of points that will be used to plot the curve.
robust       if TRUE then the median is used to plot the central tendency (recommended when logit = FALSE). If FALSE the mean is used.
ci           boolean. Should a confidence interval based on quantiles be plotted? This only works if robust = TRUE.
u.quant      upper quantile for ci. This only works if ci = TRUE and robust = TRUE.
l.quant      lower quantile for ci. This only works if ci = TRUE and robust = TRUE.
xlab         label for the x-axis. Is determined automatically if NULL.
ylab         label for the y-axis.
main         main title for the plot.
logit        boolean. Should the y-axis be on a logit scale or not? If FALSE, it is recommended to set robust = TRUE. Only applicable for classification.
ylim         the y limits of the plot.
...          other graphical parameters for plot.

Details
For classification, the response variable in the model is always assumed to take on the values {0, 1}. Resulting partial dependence plots always refer to class 1. Whenever strange results are obtained the user has three options. First, set rm.outliers = TRUE. Second, if that doesn't help, set robust = TRUE. Finally, if that doesn't help, the user can also try setting ci = TRUE. Areas with larger confidence intervals typically indicate problem areas. These options help the user tease out the root of strange results and converge to better parameter values.

Author(s)
Authors: Michel Ballings and Dirk Van den Poel. Maintainer: <*************************>

References
The code in this function uses part of the code from the partialPlot function in randomForest. It is expanded and generalized to support cross-validation and other packages.

See Also
variableImportance

Examples
library(randomForest)
# Prepare data
data(iris)
iris <- iris[1:100, ]
iris$Species <- as.factor(ifelse(factor(iris$Species) == "setosa", 0, 1))
# Cross-validated models
# Estimate 10 models and create 10 test sets
data <- list()
rf <- list()
for (i in 1:10) {
  ind <- sample(nrow(iris), 50)
  rf[[i]] <- randomForest(Species ~ ., iris[ind, ])
  data[[i]] <- iris[-ind, ]
}
parDepPlot( = "Petal.Width", object = rf, data = data)
# Single model
ind <- sample(nrow(iris), 50)
rf <- randomForest(Species ~ ., iris[ind, ])
parDepPlot( = "Petal.Width", object = rf, data = iris[-ind, ])

variableImportance    Permutation-based Variable Importance Measures

Description
variableImportance produces permutation-based variable importance measures (currently only for binary classification models from the package randomForest, and only for the performance measure AUROC).

Usage
variableImportance(
  object = NULL,
  xdata = NULL,
  ydata = NULL,
  CV = 3,
  measure = "AUROC",
  sort = TRUE
)

Arguments
object   a model. Currently only binary classification models from the package randomForest.
xdata    a data frame containing the predictors for the model.
ydata    a factor containing the response variable.
CV       cross-validation. How many times should the data be permuted and the decrease in performance be calculated? Afterwards the mean is taken. CV should be higher for very small samples to ensure stability.
measure  currently only Area Under the Receiver Operating Characteristic Curve (AUROC) is supported.
sort     logical. Should the results be sorted from high to low?

Details
Currently only binary classification models from randomForest are supported. Also, currently only AUROC is supported. Definition of MeanDecreaseAUROC: for the entire ensemble, the AUROC is recorded on the provided xdata. The same is subsequently done after permuting each variable (iteratively, for each variable separately). Then the latter is subtracted from the former. This is called the decrease in AUROC. If we do this for multiple CV, it becomes the mean decrease in AUROC.

Value
A data frame containing the variable names and the mean decrease in AUROC.

Author(s)
Authors: Michel Ballings and Dirk Van den Poel. Maintainer: <*************************>

See Also
parDepPlot

Examples
# Prepare data
data(iris)
iris <- iris[1:100, ]
iris$Species <- as.factor(ifelse(factor(iris$Species) == "setosa", 0, 1))
# Estimate model
library(randomForest)
ind <- sample(nrow(iris), 50)
rf <- randomForest(Species ~ ., iris[ind, ])
# Obtain variable importances
variableImportance(object = rf,
                   xdata = iris[-ind, names(iris) != "Species"],
                   ydata = iris[-ind, ]$Species)

Index
interpretR (interpretR-package)
interpretR-package
interpretRNews
parDepPlot
variableImportance
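The MeanDecreaseAUROC definition in the Details section above (score the model on xdata, permute one predictor at a time, rescore, subtract, and average over CV repetitions) is a general procedure. Here is a minimal, self-contained Python sketch of it, with a toy scoring function standing in for the random forest; it illustrates the algorithm and is not interpretR's R code:

```python
import random

def auroc(scores, labels):
    """AUROC via the rank-sum (Mann-Whitney) formula."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def permutation_importance(model, X, y, cv=3, seed=0):
    """Mean decrease in AUROC when each column of X is permuted,
    averaged over cv independent permutations per column."""
    rng = random.Random(seed)
    base = auroc([model(row) for row in X], y)
    drops = []
    for j in range(len(X[0])):
        total = 0.0
        for _ in range(cv):
            perm = [row[j] for row in X]
            rng.shuffle(perm)
            Xp = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, perm)]
            total += base - auroc([model(row) for row in Xp], y)
        drops.append(total / cv)
    return drops

# Toy data: column 0 determines the label, column 1 is noise.
X = [[float(i), float(i % 2)] for i in range(20)]
y = [1 if i >= 10 else 0 for i in range(20)]
model = lambda row: row[0]   # "model" scores by column 0 only
drops = permutation_importance(model, X, y, cv=5)
```

Permuting the informative column destroys the ranking and produces a large drop, while permuting the ignored noise column changes nothing, which is exactly the behavior the MeanDecreaseAUROC measure relies on.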
GR CORE, Chinese edition
2. Reliability assurance program
  2.1 Supplier qualification and device qualification
    2.1.1 Specifications and control
    2.1.2 Supplier qualification
    2.1.3 Standards for general device-qualification procedures
      2.1.3.1 Qualification test documentation
      2.1.3.2 Qualification of similar devices
      2.1.3.3 Component-level qualification (omitted)
      2.1.3.4 Devices for interim use
      2.1.3.5 Use of supplier-provided data
      2.1.3.6 Handling of internally manufactured devices
      2.1.3.7 Sampling for qualification testing
        2.1.3.7.1 LTPD sampling plans
        2.1.3.7.2 Use of devices that fail qualification
        2.1.3.7.3 Handling of small-lot devices
        2.1.3.7.4 Characterization test data for additional samples
        2.1.3.7.5 Additional considerations for stress testing
      2.1.3.8 Rules for devices that fail qualification
    2.1.4 Requalification
  2.2 Lot-by-lot control
    2.2.1 Definition of a lot
    2.2.2 Purchase specifications
Extract-Transform-Load Framework v0.4.1 User Guide
Package 'etl'                                              October 12, 2023

Type: Package
Title: Extract-Transform-Load Framework for Medium Data
Version: 0.4.1
Maintainer: Benjamin S. Baumer <********************>
Description: A predictable and pipeable framework for performing ETL (extract-transform-load) operations on publicly-accessible medium-sized data sets. This package sets up the method structure and implements generic functions. Packages that depend on this package download specific data sets from the Internet, clean them up, and import them into a local or remote relational database management system.
License: CC0
Imports: DBI, dbplyr, datasets, downloader, fs, janitor, lubridate, methods, readr, rlang, rvest, tibble, usethis, utils, xml2
Depends: R (>= 2.10), dplyr
Suggests: knitr, RSQLite, RPostgreSQL, RMySQL, ggplot2, testthat, rmarkdown
URL: https:///beanumber/etl
BugReports: https:///beanumber/etl/issues
RoxygenNote: 7.2.3
Encoding: UTF-8
VignetteBuilder: knitr, rmarkdown, ggplot2, dplyr, dbplyr
NeedsCompilation: no
Author: Benjamin S. Baumer [aut, cre] (<https:///0000-0002-3279-0516>), Carson Sievert [ctb], Natalia Iannucci [ctb]
Repository: CRAN
Date/Publication: 2023-10-12 19:10:02 UTC

R topics documented: create_etl_package, dbRunScript, dbWipe, db_type, etl, etl_cleanup, etl_init, match_files_by_year_months, smart_download, smart_upload, src_mysql_cnf, valid_year_month

create_etl_package    Create an ETL package skeleton

Description

Create an ETL package skeleton.

Usage

create_etl_package(...)

Arguments

...  arguments passed to create_package

Details

Extends create_package and places a template source file in the R subdirectory of the new package. The file has a working stub of etl_extract. The new package can be built immediately and run. New S3 methods for etl_transform and etl_load can be added if necessary, but the default methods may suffice.

See Also

etl_extract, etl_transform, etl_load

Examples

## Not run:
path <- file.path(tempdir(), "scorecard")
create_etl_package(path)
## End(Not run)
# Now switch projects, and "Install and Restart"

dbRunScript    Execute an SQL script

Description

Execute an SQL script.

Usage

dbRunScript(conn, script, echo = FALSE, ...)

Arguments

conn    a DBIConnection-class object
script  either a filename pointing to an SQL script or a character vector of length 1 containing SQL
echo    print the SQL commands to the output?
...     arguments passed to dbExecute

Details

The SQL script file must be ";" delimited.

Value

a list of results from dbExecute for each of the individual SQL statements in script.

Examples

sql <- "SHOW TABLES; SELECT 1 + 1 as Two;"
sql2 <- system.file("sql", "mtcars.mysql", package = "etl")
sql3 <- "SELECT * FROM user WHERE user = 'mysql'; SELECT * FROM user WHERE 't' = 't';"
if (require(RSQLite)) {
  con <- dbConnect(RSQLite::SQLite())
  dbRunScript(con, "SELECT 1+1 as Two; VACUUM; ANALYZE;")
}
## Not run:
if (require(RMySQL)) {
  con <- dbConnect(RMySQL::MySQL(), default.file = path.expand("~/f"),
                   group = "client", user = NULL, password = NULL,
                   dbname = "mysql", host = "127.0.0.1")
  dbRunScript(con, script = sql)
  dbRunScript(con, script = sql2)
  dbRunScript(con, script = sql3)
  dbDisconnect(con)
}
## End(Not run)

dbWipe    Wipe out all tables in a database

Description

Wipe out all tables in a database.

Usage

dbWipe(conn, ...)

Arguments

conn  A DBIConnection object, as returned by dbConnect()
...   Other parameters passed on to methods

Details

Finds all tables within a database and removes them.

db_type    Return the database type for an ETL or DBI connection

Description

Return the database type for an ETL or DBI connection.

Usage

db_type(obj, ...)

## S3 method for class 'src_dbi'
db_type(obj, ...)

## S3 method for class 'DBIConnection'
db_type(obj, ...)

Arguments

obj  an etl or DBIConnection-class object
...  currently ignored

Examples

if (require(RMySQL) && mysqlHasDefault()) {
  # connect to test database using rs-dbi
  db <- src_mysql_cnf()
  class(db)
  db
  # connect to another server using the client group
  db_type(db)
  db_type(db$con)
}

etl    Initialize an etl object

Description

Initialize an etl object.

Usage

etl(x, db = NULL, dir = tempdir(), ...)

## Default S3 method:
etl(x, db = NULL, dir = tempdir(), ...)

## S3 method for class 'etl'
summary(object, ...)

is.etl(object)

## S3 method for class 'etl'
print(x, ...)

Arguments

x       the name of the etl package that you wish to populate with data. This determines the class of the resulting etl object, which determines method dispatch of etl_*() functions. There is no default, but you can use mtcars as a test example.
db      a database connection that inherits from src_dbi. It is NULL by default, which results in a SQLite connection being created in dir.
dir     a directory to store the raw and processed data files
...     arguments passed to methods (currently ignored)
object  an object for which a summary is desired

Details

A constructor function that instantiates an etl object. An etl object extends a src_dbi object. It also has attributes for:

pkg       the name of the etl package corresponding to the data source
dir       the directory where the raw and processed data are stored
raw_dir   the directory where the raw data files are stored
load_dir  the directory where the processed data files are stored

Just like any src_dbi object, an etl object is a data source backed by an SQL database. However, an etl object has additional functionality based on the presumption that the SQL database will be populated from data files stored on the local hard disk. The ETL functions documented in etl_create provide the necessary functionality for extracting data from the Internet to raw_dir, transforming those data and placing the cleaned-up data (usually in CSV format) into load_dir, and finally loading the clean data into the SQL database.

Value

For etl, an object of class etl_x and etl that inherits from src_dbi. For is.etl, TRUE or FALSE, depending on whether x has class etl.

See Also

etl_create

Examples

# Instantiate the etl object
cars <- etl("mtcars")
str(cars)
is.etl(cars)
summary(cars)
## Not run:
# connect to a PostgreSQL server
if (require(RPostgreSQL)) {
  db <- src_postgres("mtcars", user = "postgres", host = "localhost")
  cars <- etl("mtcars", db)
}
## End(Not run)
# Do it step-by-step
cars %>%
  etl_extract() %>%
  etl_transform() %>%
  etl_load()
src_tbls(cars)
cars %>%
  tbl("mtcars") %>%
  group_by(cyl) %>%
  summarize(N = n(), mean_mpg = mean(mpg))
# Do it all in one step
cars2 <- etl("mtcars")
cars2 %>% etl_update()
src_tbls(cars2)
# generic summary function provides information about the object
cars <- etl("mtcars")
summary(cars)
cars <- etl("mtcars")
# returns TRUE
is.etl(cars)
# returns FALSE
is.etl("hello world")
cars <- etl("mtcars") %>% etl_create()
cars

etl_cleanup    ETL functions for working with medium sized data

Description

These generic functions provide a systematic approach for performing ETL (extract-transform-load) operations on medium-sized data.

Usage

etl_cleanup(obj, ...)

## Default S3 method:
etl_cleanup(obj, delete_raw = FALSE, delete_load = FALSE,
            pattern = "\\.(csv|zip)$", ...)

etl_create(obj, ...)

## Default S3 method:
etl_create(obj, ...)

etl_update(obj, ...)

## Default S3 method:
etl_update(obj, ...)

etl_extract(obj, ...)

## Default S3 method:
etl_extract(obj, ...)

## S3 method for class 'etl_mtcars'
etl_extract(obj, ...)

## S3 method for class 'etl_cities'
etl_extract(obj, ...)

etl_load(obj, ...)

## Default S3 method:
etl_load(obj, ...)

etl_transform(obj, ...)

## Default S3 method:
etl_transform(obj, ...)

## S3 method for class 'etl_cities'
etl_transform(obj, ...)

Arguments

obj          an etl object
...          arguments passed to methods
delete_raw   should files be deleted from the raw_dir?
delete_load  should files be deleted from the load_dir?
pattern      regular expression matching file names to be deleted. By default, this matches filenames ending in .csv and .zip.

Details

The purposes of these functions are to download data from a particular data source from the Internet, process it, and load it into a SQL database server. There are five primary functions:

etl_init       Initialize the database schema.
etl_extract    Download data from the Internet and store it locally in its raw form.
etl_transform  Manipulate the raw data such that it can be loaded into a database. Usually, this means converting the raw data to (a series of) CSV files, which are also stored locally.
etl_load       Load the transformed data into the database.
etl_cleanup    Perform housekeeping, such as deleting unnecessary raw data files.

Additionally, two convenience functions chain these operations together:

etl_create  Run all five functions in succession. This is useful when you want to create the database from scratch.
etl_update  Run the etl_extract, etl_transform, and etl_load functions in succession. This is useful when the database already exists, but you want to insert some new data.

Value

Each one of these functions returns an etl object, invisibly.

See Also

etl, etl_init

Examples

## Not run:
if (require(RPostgreSQL)) {
  db <- src_postgres(dbname = "mtcars", user = "postgres", host = "localhost")
  cars <- etl("mtcars", db)
}
if (require(RMySQL) && mysqlHasDefault()) {
  db <- src_mysql(dbname = "mtcars", user = "r-user", host = "localhost",
                  password = "mypass")
  cars <- etl("mtcars", db)
}
## End(Not run)
cars <- etl("mtcars")
cars %>%
  etl_extract() %>%
  etl_transform() %>%
  etl_load() %>%
  etl_cleanup()
cars
cars %>%
  tbl(from = "mtcars") %>%
  group_by(cyl) %>%
  summarise(N = n(), mean_mpg = mean(mpg))
# do it all in one step, and peek at the SQL creation script
cars %>% etl_create(echo = TRUE)
# specify a directory for the data
## Not run:
cars <- etl("mtcars", dir = "~/dumps/mtcars/")
str(cars)
## End(Not run)
cars <- etl("mtcars")
# Do it step-by-step
cars %>%
  etl_extract() %>%
  etl_transform() %>%
  etl_load()
# Note the somewhat imprecise data types for the columns. These are the default.
tbl(cars, "mtcars")
# But you can also specify your own schema if you want
schema <- system.file("sql", "init.sqlite", package = "etl")
cars %>%
  etl_init(schema) %>%
  etl_load()

etl_init    Initialize a database using a defined schema

Description

Initialize a database using a defined schema.

Usage

etl_init(obj, script = NULL, schema_name = "init", pkg = attr(obj, "pkg"),
         ext = NULL, ...)

## Default S3 method:
etl_init(obj, script = NULL, schema_name = "init", pkg = attr(obj, "pkg"),
         ext = NULL, ...)

find_schema(obj, schema_name = "init", pkg = attr(obj, "pkg"), ext = NULL, ...)
Arguments

obj          An etl object
script       either a vector of SQL commands to be executed, or a file path as a character vector containing an SQL initialization script. If NULL (the default), then the appropriate built-in schema will be fetched by find_schema, if it exists. Note that the flavor of SQL in this file must match the type of the source. That is, if your object is of type src_mysql, then make sure that the schema you specify here is written in MySQL (and not PostgreSQL). Please note that SQL syntax is not, in general, completely portable. Use with caution, as this may clobber any existing data you have in an existing database.
schema_name  The name of the schema. Default is init.
pkg          The package defining the schema. Should be set in etl.
ext          The file extension used for the SQL schema file. If NULL (the default) it will be inferred from the src_* class of con. For example, if con has class SQLite then ext will be sqlite.
...          Currently ignored

Details

If the table definitions are at all non-trivial, you may wish to include a pre-defined table schema. This function will retrieve it.

Examples

cars <- etl("mtcars")
cars %>% etl_init()
cars %>% etl_init(script = sql("CREATE TABLE IF NOT EXISTS mtcars_alt (id INTEGER);"))
cars %>% etl_init(schema_name = "init")
init_script <- find_schema(cars, schema_name = "init")
cars %>% etl_init(script = init_script, echo = TRUE)
src_tbls(cars)
cars <- etl("mtcars")
find_schema(cars)
find_schema(cars, "init", "etl")
find_schema(cars, "my_crazy_schema", "etl")

match_files_by_year_months    Match year and month vectors to filenames

Description

Match year and month vectors to filenames. Extracts a date from filenames.

Usage

match_files_by_year_months(files, pattern,
                           years = as.numeric(format(Sys.Date(), "%Y")),
                           months = 1:12, ...)

extract_date_from_filename(files, pattern, ...)

Arguments

files    a character vector of filenames
pattern  a regular expression to be passed to fast_strptime
years    a numeric vector of years
months   a numeric vector of months
...      arguments passed to fast_strptime

Value

a character vector of files that match the pattern, year, and month arguments; a vector of POSIXct dates matching the pattern

Examples

## Not run:
if (require(airlines)) {
  airlines <- etl("airlines", dir = "~/Data/airlines") %>%
    etl_extract(year = 1987)
  summary(airlines)
  match_files_by_year_months(list.files(attr(airlines, "raw_dir")),
                             pattern = "On_Time_On_Time_Performance_%Y_%m.zip",
                             year = 1987)
}
## End(Not run)

smart_download    Download only those files that don't already exist

Description

Download only those files that don't already exist.

Usage

smart_download(obj, src, new_filenames = basename(src), clobber = FALSE, ...)

Arguments

obj            an etl object
src            a character vector of URLs that you want to download
new_filenames  an optional character vector of filenames for the new (local) files. Defaults to having the same filenames as those in src.
clobber        do you want to clobber any existing files?
...            arguments passed to download

Details

Downloads only those files in src that are not already present in the directory specified by the raw_dir attribute of obj.

Author(s)

idiom courtesy of Hadley Wickham

Examples

## Not run:
cars <- etl("mtcars")
urls <- c("https:///beanumber/etl/master/etl.Rproj",
          "https:///robots.txt")
smart_download(cars, src = urls)
# won't download again if the files are already there
smart_download(cars, src = urls)
# use clobber to overwrite
smart_download(cars, src = urls, clobber = TRUE)
## End(Not run)

smart_upload    Upload a list of files to the DB

Description

Upload a list of files to the DB.

Usage

smart_upload(obj, src = NULL, tablenames = NULL, ...)

Arguments

obj         An etl object
src         a list of CSV files to upload. If NULL, will return all CSVs in the load directory.
tablenames  a list the same length as src of tablenames in the database corresponding to each of the files in src. If NULL, will default to the same name as src, without paths or file extensions.
...         arguments passed to dbWriteTable

Examples

## Not run:
if (require(RMySQL)) {
  # must have pre-existing database "fec"
  # if not, try
  system("mysql -e 'CREATE DATABASE IF NOT EXISTS fec;'")
  db <- src_mysql_cnf(dbname = "mtcars")
}
## End(Not run)

src_mysql_cnf    Connect to local MySQL Server using ~/f

Description

Connect to local MySQL Server using ~/f.

Usage

src_mysql_cnf(dbname = "test", groups = "rs-dbi", ...)

Arguments

dbname  name of the local database you wish to connect to. Default is test, as in mysqlHasDefault.
groups  section of the ~/f file. Default is rs-dbi, as in mysqlHasDefault.
...     arguments passed to src_mysql

See Also

src_mysql, mysqlHasDefault

Examples

if (require(RMySQL) && mysqlHasDefault()) {
  # connect to test database using rs-dbi
  db <- src_mysql_cnf()
  class(db)
  db
  # connect to another server using the client group
  src_mysql_cnf(groups = "client")
}

valid_year_month    Ensure that years and months are within a certain time span

Description

Ensure that years and months are within a certain time span.

Usage

valid_year_month(years, months, begin = "1870-01-01", end = Sys.Date())

Arguments

years   a numeric vector of years
months  a numeric vector of months
begin   the earliest valid date, defaults to the UNIX epoch
end     the most recent valid date, defaults to today

Details

Often, a data source will begin and end at known points in time. At the same time, many data sources are divided into monthly archives. Given a set of years and months, any combination of which should be considered valid, this function will return a data.frame in which each row is one of those valid year-month pairs. Further, if the optional begin and end arguments are specified, the rows will be filtered to lie within that time interval. Furthermore, the first and last day of each month are computed.

Value

a data.frame with four variables: year, month, month_begin (the first day of the month), and month_end (the last day of the month).

Examples

valid_year_month(years = 1999:2001, months = c(1:3, 7))
# Mets in the World Series since the UNIX epoch
mets_ws <- c(1969, 1973, 1986, 2000, 2015)
valid_year_month(years = mets_ws, months = 10)
# Mets in the World Series during the Clinton administration
if (require(ggplot2)) {
  clinton <- filter(presidential, name == "Clinton")
  valid_year_month(years = mets_ws, months = 10,
                   begin = clinton$start, end = clinton$end)
}
RWildbook R Package User Guide
Package 'RWildbook'                                        October 12, 2022

Type: Package
Title: Interface for the 'Wildbook' Wildlife Data Management Framework
Version: 0.9.3
Description: Provides an interface with the 'Wildbook' mark-recapture ecological database framework. It helps users to pull data from the 'Wildbook' framework and format data for further analysis with mark-recapture applications like 'Program MARK' (which can be accessed via the 'RMark' package in 'R'). Further information on the 'Wildbook' framework is available at: <http:///doku.php>.
License: GPL (>= 2)
Depends: R (>= 3.0.0), jsonlite, data.table, utils
Imports: marked
RoxygenNote: 5.0.1
Suggests: knitr, rmarkdown
VignetteBuilder: knitr
NeedsCompilation: no
Author: Simon Bonner [aut, cre], Xinxin Huang [aut]
Maintainer: Simon Bonner <***************>
Repository: CRAN
Date/Publication: 2018-04-06 03:15:09 UTC

R topics documented: dateTOmillisecond, filterstring, get_os, markedData, RWildbook, searchWB, sexstring, vignette_1_data, vignette_2_data, WBjdoql, WBsearchURL

dateTOmillisecond    Transform a vector of dates to a vector of milliseconds

Description

This function transforms a period of time in dates to milliseconds according to the origin date and the format of the dates.

Usage

dateTOmillisecond(date, origin = "1970-01-01", format = "%Y-%m-%d",
                  interval = TRUE)

Arguments

date      A character vector representing a period of time from date[1] to date[2].
origin    A point of time which is set to be zero in milliseconds.
format    A format for the date and origin arguments.
interval  A logical argument which equals TRUE by default, in which case the function transforms a period of dates to milliseconds. When interval = FALSE, the function converts the last minute of every element of date to milliseconds.

Details

A vector of size two represents a period of time. The start date of the period will be transformed to the first millisecond of that date, and the end of the period to the last millisecond.

filterstring    Generate the JDOQL part of a filter

Description

This function generates the string in the JDOQL query for some given values of a filter.

Usage

filterstring(filtername, filtervalues, logic = "||", bridge = "==")

Arguments

filtername    A character which is the corresponding variable name in the Wildbook framework.
filtervalues  A vector of the values for the filter.
logic         A parameter which can be "&&" for the logical AND or "||" for the logical OR.
bridge        An operator to connect the name and the values; the default is "==".

get_os    Identify operating system as Mac, *nix, or Windows

Description

Copied from Hadley Wickham's rappdirs package at his suggestion. The rappdirs package currently does not export this function.

Usage

get_os()

Value

Operating system name (character string).

markedData    Format Wildbook data for mark-recapture analysis

Description

Format data from the searchWB function in the RWildbook package for mark-recapture analysis with the marked and RMark packages.

Usage

markedData(data, varname_of_capturetime = "dateInMilliseconds",
           varlist = c("individualID"), start.dates, end.dates = NULL,
           date_format = "%Y-%m-%d", origin = "1970-01-01",
           removeZeros = TRUE)

Arguments

data         The raw data set from the searchWB function in the RWildbook package.
varname_of_capturetime  A character object which is the variable name for the capture/encounter sighting time.
varlist      A character vector of the names of variables for mark-recapture analysis.
start.dates  A character vector of dates which are the start dates of the capture occasions. The elements should be in the form of date_format.
end.dates    A character vector of dates which are the end dates of the capture occasions. The elements should be in the form of date_format.
date_format  The format for all the arguments of date value.
origin       A point of time which is set to be zero in milliseconds.
removeZeros  If TRUE (default) then individuals with no captures are removed from the data.
Details

The markedData function formats the Wildbook data set that users retrieve with the searchWB function for mark-recapture analysis with the marked and RMark packages. In the marked package, users can process a certain form of data set with the process.data function. The markedData function returns a data set which can be the input data set of process.data.

Default NULL value for the end.dates argument: the default value for the end.dates argument is NULL, which means the capture occasion intervals are divided by the elements of the start.dates argument. In this case, the end date of the last capture occasion is the value of Sys.Date().

The class of the output: the class of the output of markedData is "data.table" and "data.frame". With the data.table package installed, the output is a data.table; otherwise it is a data.frame. That means users can process the data with the data.table package. Also, users can directly process the output with the process.data function in the marked package.

Examples

## Not run:
## You will need to supply your own login information to run these examples.
## Load packages
library(marked)
## Extract data for individuals A-001 through A-099
data1 <- searchWB(username = "username",
                  password = "password",
                  baseURL = "",
                  object = "Encounter",
                  individualID = paste0("A-0", rep(0:9, rep(10, 10)), rep(0:9, 10))[-1])
## Define start and end dates of capture occasions
start.dates1 <- paste0(1998:2016, "-01-01")  # Define the start.dates value
end.dates1 <- paste0(1998:2016, "-04-01")    # Define the end.dates value
## Format data for use in marked
markedData1.1 <- markedData(data = data1,
                            varname_of_capturetime = "dateInMilliseconds",
                            varlist = c("individualID"),
                            start.dates = start.dates1,
                            end.dates = NULL,
                            date_format = "%Y-%m-%d",
                            origin = "1970-01-01",
                            removeZeros = TRUE)
## Fit simple CJS model in marked
markedData1.proc = process.data(markedData1.1, model = "CJS", begin.time = 1)
markedData1.ddl = make.design.data(markedData1.proc)
markedData1.cjs = crm(markedData1.proc, markedData1.ddl,
                      model.parameters = list(Phi = list(formula = ~time),
                                              p = list(formula = ~time)))
## Format data including location as a covariate
markedData1.2 <- markedData(data = data1,
                            varname_of_capturetime = "dateInMilliseconds",
                            varlist = c("individualID", "locationID"),
                            start.dates = start.dates1,
                            end.dates = end.dates1,
                            date_format = "%Y-%m-%d",
                            origin = "1970-01-01")
## End(Not run)

RWildbook    RWildbook

Description

The primary objective of this package is to provide an R interface with the Wildbook mark-recapture ecological database framework. It helps users to pull data from the Wildbook framework and format data for further analysis with mark-recapture applications like Program MARK (which can be accessed via the RMark package in R).

searchWB    Pull data from the Wildbook framework

Description

This function allows users to pull data from the Wildbook framework into R.

Usage

searchWB(searchURL = NULL, username = NULL, password = NULL, baseURL,
         jdoql = NULL, protocol = "https", object = "encounter",
         location = NULL, locationID = NULL,
         sighting_date = c("1964-01-01", "2016-12-31"),
         encounter_submission_date = c("2003-01-01", "2016-12-31"),
         date_format = "%Y-%m-%d", sex = c("male", "female", "unknown"),
         status = c("alive", "dead"), measurement = NULL,
         individualID = NULL, encounterID = NULL, encounter_type = NULL,
         Date_of_birth = NULL, Date_of_death = NULL, jsonfile = NULL,
         showJDOQL = FALSE, showURL = TRUE)

Arguments

searchURL  A character object of the URL for data searching in the Wildbook framework.
username   A character object of the username in the Wildbook framework.
password   A character object of the password in the Wildbook framework.
baseURL    A character object of the base URL representing the Wildbook database.
jdoql      A character object of the JDOQL string for data searching.
protocol   Defines the communication protocol: either "http" or "https" (default).
object     A character object defining the search type. The value can be either "encounter" for the encounter search or "individual" for the individual search. The default value is "encounter".
location   A vector of character strings for searching encounters in locations containing the character strings.
locationID A character vector for searching encounters in locations with the specified locationID. Note that the location ID is case sensitive.
sighting_date  A character vector for filtering encounters which are sighted during a period of time. More information on the date arguments can be found in the Details section.
encounter_submission_date  A character vector for filtering encounters which are submitted during a period of time.
date_format  The format for all the arguments of date value.
sex        A character vector of maximum size three representing the values for the sex filter. The value can be any combination of "male", "female" and "unknown". The default value is sex = c("male", "female", "unknown").
status     A character vector of maximum size two representing the values for the encounter status. The value can be any combination of "alive" and "dead".
measurement  A numeric object that sets the minimum individual measurement when searching in the Wildbook framework.
individualID  A character vector of individual IDs for searching data of the specified individual IDs. Note that the individual ID is case sensitive.
encounterID   A character vector for searching data of specific encounter IDs. Note that the encounter ID is case sensitive.
encounter_type  A character vector of maximum size three for searching data with specific encounter types. It can be any combination of "unapproved", "approved" and "unidentifiable".
Date_of_birth  A character vector for searching data of individuals which were born during a period of time.
Date_of_death  A character vector for searching data of individuals which died during a period of time.
jsonfile   Name of the file in which JSON-formatted data from Wildbook will be stored. If NULL (default) then data is stored in a temporary file generated by R.
showJDOQL  logical. If FALSE (default) the function will not return the search JDOQL; otherwise the function returns the search JDOQL.
showURL    logical. If TRUE (default) the function returns the search URL; otherwise the function will not return the search URL.

Details

The searchWB function provides the main interface to the Wildbook framework and can be used in one of three ways. First, users may supply filters based on the variables within the database. These filters are combined in a single JDOQL statement which is then combined with the base URL, username, and password to create the URL search string for data extraction. Second, users may supply the JDOQL string, username and password, and base URL as separate arguments. Finally, users may supply the search URL directly. We envisage that most users will supply filters to create the search URL. The other options allow users to easily repeat or modify previous searches and enable advanced users familiar with the JDOQL API and internals of the Wildbook framework to conduct more complex searches. More examples of extracting data from the Wildbook framework with the searchWB function can be found in rwildbook-demo-1 of the RWildbook package.

Filtering locations: locations may be filtered with either location names or location IDs. Multiple location names can be given to the location argument. Multiple location IDs can be given to the locationID argument. In this case the search will return all objects (encounters or individuals) matching at least one of the locations.

Filtering dates: the sighting_date filter may be specified as a character vector of either one or two elements representing dates. If one date is provided then results will be filtered from 00:00:00 to 24:00:00 on that day. If two dates are provided then results will be filtered from 00:00:00 on the first date to 24:00:00 on the second date. By default, dates must be entered using the "YYYY-MM-DD" format. Other formats may be used by specifying the value of date_format. More details about the date format can be found in the help page of as.Date. The same rule applies to the encounter_submission_date, Date_of_birth and Date_of_death filters.

Default NULL value for filter arguments: the default value for some filter arguments is NULL. A NULL value for a filter argument returns data without filtering on that argument.

Examples

## Not run:
## The following examples conduct the same search.
## You will need to supply your own login information to run these examples.
## Search using filter arguments
data1 <- searchWB(username = "username",
                  password = "password",
                  baseURL = "",
                  object = "Encounter",
                  individualID = c("A-001"))
## Search using existing JDOQL string
jdoql <- "SELECT FROM org.ecocean.Encounter WHERE individualID == 'A-001'"
data2 <- searchWB(username = "username",
                  password = "password",
                  baseURL = "",
                  jdoql = jdoql)
## Search using existing URL
WBurl <- paste0("http://username:***********************/rest/jdoql?", jdoql)
data3 <- searchWB(searchURL = WBurl)
## End(Not run)

sexstring    Generate the JDOQL part for the sex filter

Description

This function generates the sex-related portion of the JDOQL query.

Usage

sexstring(sex)

Arguments

sex  The value for the sex filter.

vignette_1_data    Data for the first vignette

Description

Sample data for the first vignette. Originally pulled from , the data has been modified to protect the innocent.

Usage

vignette_1_data

Format

An object of class data.frame with 69 rows and 65 columns.

Source

vignette_2_data    Data for the second vignette

Description

Sample data for the second vignette. Originally pulled from , the data has been modified to protect the innocent.

Usage

vignette_2_data

Format

An object of class data.frame with 1016 rows and 65 columns.

Source

WBjdoql    Generate the JDOQL query for the search in the Wildbook framework

Description

This function generates the JDOQL query string according to the filters specified by users. The JDOQL query is an essential part of the search URL.

Usage

WBjdoql(object = "encounter", location = NULL, locationID = NULL,
        sighting_date = c("1964-01-01", "2016-12-31"),
        encounter_submission_dates = c("2003-01-01", "2016-12-31"),
        date_format = "%Y-%m-%d", sex = c("male", "female", "unknown"),
        status = c("alive", "dead"), measurement = NULL,
        individualID = NULL, encounterID = NULL, encounter_type = NULL,
        Date_of_birth = NULL, Date_of_death = NULL)

Arguments

object  can be either "encounter" for the encounter search or "individual" for the individual search.
location  A string of characters contained in location names.
locationID  A character vector for filtering the locationID.
sighting_date  A character vector for filtering encounters which are sighted during a period of time.
encounter_submission_dates  A character vector for filtering encounters which are submitted during a period of time.
date_format  The format for all the arguments of date value.
sex  A character vector of maximum size three representing the value for the sex filter.
status  A character vector of maximum size two representing the value for the encounter status.
measurement  A numeric object that sets the minimum individual measurement when searching in the Wildbook framework.
individualID  A character vector for searching data of specific individual IDs.
encounterID  A character vector for searching data of specific encounter IDs.
encounter_type  A character vector of maximum size three for searching data with specific encounter types.
Date_of_birth  A character vector for searching data of individuals which were born during a period of time.
Date_of_death  A character vector for searching data of individuals which died during a period of time.

WBsearchURL    Generate the search URL given the JDOQL query

Description

This function helps users generate the URL for data searching in the Wildbook framework from the Wildbook account information, the URL of the desired Wildbook database, and the JDOQL query.
Usage

WBsearchURL(username, password, baseURL, jdoql, protocol = "https")

Arguments

username  The username in the Wildbook framework.
password  The password in the Wildbook framework.
baseURL   The URL representing the desired Wildbook database.
jdoql     The JDOQL string for data searching.
protocol  Defines the communication protocol: either "http" or "https" (default).
forestControl package user guide
Package 'forestControl'                    October 13, 2022

Type        Package
Title       Approximate False Positive Rate Control in Selection Frequency for Random Forest
Version     0.2.2
Date        2022-02-09
Description Approximate false positive rate control in selection frequency for random forest using the methods described by Ender Konukoglu and Melanie Ganz (2014) <arXiv:1410.2838>. Methods for calculating the selection frequency threshold at false positive rates and selection frequency false positive rate feature selection.
Imports     Rcpp, purrr, tibble, magrittr, dplyr
Suggests    testthat, randomForest, ranger, parsnip, knitr, rmarkdown
License     MIT + file LICENSE
Encoding    UTF-8
URL         https:///aberHRML/forestControl
BugReports  https:///aberHRML/forestControl/issues
RoxygenNote 7.1.1
LinkingTo   Rcpp
VignetteBuilder knitr
NeedsCompilation yes
Author      Tom Wilson [aut, cre] (<https:///0000-0003-3112-4682>), Jasen Finch [aut]
Maintainer  Tom Wilson <************.uk>
Repository  CRAN
Date/Publication 2022-02-09 10:50:02 UTC

R topics documented: forestControl-package, extract_params, fpr_fs, selection_freqs, sft

forestControl-package    False Positive Rate Control in Selection Frequency for Random Forest

Description
This package is an implementation of the methods described by Ender Konukoglu and Melanie Ganz in Konukoglu, E. and Ganz, M., 2014. Approximate false positive rate control in selection frequency for random forest. arXiv preprint arXiv:1410.2838, https:///abs/1410.2838.

extract_params    Extract forest parameters

Description
For a randomForest or ranger classification object, extract the parameters needed to calculate an approximate selection frequency threshold.

Usage
extract_params(x)

Arguments
x    a randomForest, ranger or parsnip object

Value
a list of four elements
• Fn  The number of features considered at each internal node (mtry)
• Ft  The total number of features in the data set
• K   The average number of binary tests/internal nodes across the entire forest
• Tr  The total number of trees in the forest

Author(s)
Tom Wilson <************.uk>

Examples
library(randomForest)
data(iris)
iris.rf <- randomForest(iris[, -5], iris[, 5], forest = TRUE)
iris.params <- extract_params(iris.rf)
print(iris.params)

fpr_fs    False Positive Rate Feature Selection

Description
Calculate the False Positive Rate (FPR) for each feature using its selection frequency.

Usage
fpr_fs(x)

Arguments
x    a randomForest or ranger object

Value
a tibble of selection frequencies and their false positive rates

Author(s)
Jasen Finch <************.uk>

Examples
library(randomForest)
data(iris)
iris.rf <- randomForest(iris[, -5], iris[, 5], forest = TRUE)
iris.features <- fpr_fs(iris.rf)
print(iris.features)

selection_freqs    Variable Selection Frequencies

Description
Extract variable selection frequencies from randomForest and ranger model objects.

Usage
selection_freqs(x)

Arguments
x    a randomForest or ranger object

Value
a tibble of variable selection frequencies

Examples
library(randomForest)
data(iris)
iris.rf <- randomForest(iris[, -5], iris[, 5], forest = TRUE)
iris.freqs <- selection_freqs(iris.rf)
print(iris.freqs)

sft    Selection Frequency Threshold

Description
Determine the selection frequency threshold of a model at a specified false positive rate.

Usage
sft(x, alpha)

Arguments
x      a randomForest or ranger object
alpha  a false positive rate (i.e. 0.01)

Value
a list of two elements
• sft          The selection frequency threshold
• probs_atsft  The estimated false positive rate

Author(s)
Tom Wilson <************.uk>

Examples
library(randomForest)
data(iris)
iris.rf <- randomForest(iris[, -5], iris[, 5], forest = TRUE)
# For a false positive rate of 1%
iris.sft <- sft(iris.rf, 0.01)
print(iris.sft)
# To iterate through a range of alpha values
alpha <- c(0.01, 0.05, 0.1, 0.15, 0.2, 0.25)
threshold <- NULL
for (i in seq_along(alpha)) {
  threshold[i] <- sft(iris.rf, alpha[i])$sft
}
plot(alpha, threshold, type = "b")

Index
extract_params, forestControl-package, fpr_fs, selection_freqs, sft
intensegRid package manual
Package 'intensegRid'                    November 8, 2022

Type        Package
Title       R Wrapper for the Carbon Intensity API
Version     0.1.2
Author      Kasia Kulma <*************************>
Maintainer  Kasia Kulma <*************************>
Description Electricity is not made equal and it varies in its carbon footprint (or carbon intensity) depending on its source. This package enables access to and querying of data provided by the Carbon Intensity API (<https:///>). National Grid's Carbon Intensity API provides an indicative trend of regional carbon intensity of the electricity system in Great Britain.
License     CC0
Encoding    UTF-8
LazyData    TRUE
URL         https:///KKulma/intensegRid, https://kkulma.github.io/intensegRid/articles/intro-to-carbon-intensity.html
BugReports  https:///KKulma/intensegRid/issues
RoxygenNote 7.2.1
Depends     R (>= 2.10)
Imports     dplyr, httr, jsonlite, lubridate, magrittr, tidyr, tibble, rlang, purrr
Suggests    utils, knitr, rmarkdown, testthat (>= 2.1.0), covr, vcr
VignetteBuilder knitr
NeedsCompilation no
Repository  CRAN
Date/Publication 2022-11-08 10:50:06 UTC

R topics documented: clean_colnames, get_british_ci, get_data, get_factors, get_mix, get_national_ci, get_postcode_ci, get_regional_ci, get_stats, regions_lookup

clean_colnames    Tidy up intensity results column names

Description
Tidy up intensity results column names.

Usage
clean_colnames(result)

Arguments
result    a data frame with raw results from the Carbon Intensity API

Value
data frame

get_british_ci    Fetch British carbon intensity data for a specified time period

Description
Fetch British carbon intensity data for a specified time period.

Usage
get_british_ci(start = NULL, end = NULL)

Arguments
start    character A start date of the intensity data.
end      character An end date of the intensity data. The maximum date range is limited to 14 days.

Value
a data.frame with 1/2-hourly carbon intensity data for the specified time period

Examples
## Not run:
get_british_ci()
get_british_ci(start = "2019-01-01", end = "2019-01-02")
## End(Not run)

get_data    Retrieve raw data from the Carbon Intensity API

Description
Retrieve raw data from the Carbon Intensity API.

Usage
get_data(call)

Arguments
call    character API URL

Value
tibble

get_factors    Get Carbon Intensity factors for each fuel type

Description
Get Carbon Intensity factors for each fuel type.

Usage
get_factors()

Value
a tibble

Examples
get_factors()

get_mix    Get generation mix for current half hour

Description
Get generation mix for current half hour.

Usage
get_mix(start = NULL, end = NULL)

Arguments
start    character A start date of the intensity data
end      character An end date of the intensity data

Value
tibble

Examples
## Not run:
start <- "2019-04-01"
end <- "2019-04-07"
get_mix(start, end)
get_mix()
## End(Not run)

get_national_ci    Get Carbon Intensity data for current half hour for a specified GB Region

Description
Get Carbon Intensity data for current half hour for a specified GB Region.

Usage
get_national_ci(start = NULL, end = NULL, region = NULL)

Arguments
start    character A start date of the intensity data
end      character An end date of the intensity data
region   character The name of the GB region, one of 'England', 'Scotland' or 'Wales'

Value
a tibble

Examples
## Not run:
get_national_ci()
get_national_ci(region = "England")
get_national_ci(region = "Scotland")
get_national_ci(region = "Wales")
get_national_ci(start = "2019-01-01", end = "2019-01-02")
## End(Not run)

get_postcode_ci    Get Carbon Intensity for a specified postcode

Description
Get Carbon Intensity for a specified postcode.

Usage
get_postcode_ci(postcode, start = NULL, end = NULL)

Arguments
postcode    character Outward postcode, i.e. RG41 or SW1 or TF8. Do not include the full postcode, outward postcode only
start       character A start date of the intensity data
end         character An end date of the intensity data

Value
tibble

Examples
## Not run:
get_postcode_ci(postcode = "EN2")
get_postcode_ci(postcode = "EN2", start = "2019-01-01", end = "2019-01-02")
## End(Not run)

get_regional_ci    Get Carbon Intensity data between specified datetimes for a specified region

Description
Get Carbon Intensity data between specified datetimes for a specified region.

Usage
get_regional_ci(region_id, start = NULL, end = NULL)

Arguments
region_id    numeric Region ID in the UK region. See the list of Region IDs in regions_lookup
start        character A start date of the intensity data
end          character An end date of the intensity data

Value
a tibble

Examples
## Not run:
get_regional_ci(13)
get_regional_ci(13, start = "2019-01-02", end = "2019-01-03")
## End(Not run)

get_stats    Get Carbon Intensity statistics between from and to dates

Description
Get Carbon Intensity statistics between from and to dates.

Usage
get_stats(start, end, block = NULL)

Arguments
start    character A start date of the stats data. The maximum date range is limited to 30 days.
end      character An end date of the stats data. The maximum date range is limited to 30 days.
block    numeric Block length in hours, i.e. a block length of 2 hrs over a 24 hr period returns 12 items with the average, max, min for each 2 hr block

Value
tibble

Examples
## Not run:
start <- "2019-04-01"
end <- "2019-05-01"
get_stats(start, end)
get_stats(start, end, block = 2)
## End(Not run)

regions_lookup    regions_lookup

Description
A lookup table of region_ids and corresponding GB regions.

Usage
regions_lookup

Format
A data frame with 17 rows and 2 variables:
Region ID    region ID to be used in the intensegRid package
Shortname    corresponding GB region

Source
https://carbon-intensity.github.io/api-definitions/#region-list

Index
∗ datasets: regions_lookup
clean_colnames, get_british_ci, get_data, get_factors, get_mix, get_national_ci, get_postcode_ci, get_regional_ci, get_stats, regions_lookup
Manual for basefun (Infrastructure for Computing with Basis Functions), version 1.1-4
Package 'basefun'                    May 16, 2023

Title       Infrastructure for Computing with Basis Functions
Version     1.1-4
Date        2023-05-15
Description Some very simple infrastructure for basis functions.
Depends     variables (>= 1.1-0), R (>= 3.2.0)
Imports     stats, polynom, Matrix, orthopolynom, methods
Suggests    coneproj
URL
License     GPL-2
NeedsCompilation yes
Author      Torsten Hothorn [aut, cre] (<https:///0000-0001-8301-0471>)
Maintainer  Torsten Hothorn <*****************************>
Repository  CRAN
Date/Publication 2023-05-16 15:30:05 UTC

R topics documented: basefun-package, as.basis, b, Bernstein_basis, c.basis, intercept_basis, Legendre_basis, log_basis, polynomial_basis, predict.basis

basefun-package    General Information on the basefun Package

Description
The basefun package offers a small collection of objects for handling basis functions and corresponding methods. The package was written to support the mlt package and will be of limited use outside this package.

Author(s)
This package is authored by Torsten Hothorn <*****************************>.

References
Torsten Hothorn (2018), Most Likely Transformations: The mlt Package, Journal of Statistical Software, forthcoming. URL: https:///package=mlt.docreg

as.basis    Convert Formula or Factor to Basis Function

Description
Convert a formula or factor to basis functions.

Usage
as.basis(object, ...)
## S3 method for class 'formula'
as.basis(object, data = NULL, remove_intercept = FALSE, ui = NULL, ci = NULL,
         negative = FALSE, scale = FALSE, Matrix = FALSE, prefix = "", ...)
## S3 method for class 'factor_var'
as.basis(object, ...)
## S3 method for class 'ordered_var'
as.basis(object, ...)
## S3 method for class 'factor'
as.basis(object, ...)
## S3 method for class 'ordered'
as.basis(object, ...)

Arguments
object    a formula or an object of class factor, factor_var, ordered or ordered_var
data      either a vars object or a data.frame
remove_intercept
          a logical indicating if any intercept term shall be removed
ui        a matrix defining constraints
ci        a vector defining constraints
negative  a logical indicating negative basis functions
scale     a logical indicating a scaling of each column of the model matrix to the unit interval (based on observations in data)
Matrix    a logical requesting a sparse model matrix, that is, a Matrix object
prefix    character prefix for model matrix column names (allows disambiguation of parameter names)
...       additional arguments to model.matrix, for example contrasts

Details
as.basis returns a function for the evaluation of the basis functions with corresponding model.matrix and predict methods.
Unordered factors (classes factor and factor_var) use a dummy coding and ordered factors (classes ordered or ordered_var) lead to a treatment contrast to the last level and removal of the intercept term with a monotonicity constraint. Additional arguments (...) are ignored for ordered factors.
Linear constraints on parameters parm are defined by ui %*% parm >= ci.

Examples
## define variables and basis functions
v <- c(numeric_var("x"), factor_var("y", levels = LETTERS[1:3]))
fb <- as.basis(~ x + y, data = v, remove_intercept = TRUE, negative = TRUE,
               contrasts.arg = list(y = "contr.sum"))
## evaluate basis functions
model.matrix(fb, data = as.data.frame(v, n = 10))
## basically the same as (but w/o intercept and times -1)
model.matrix(~ x + y, data = as.data.frame(v, n = 10))
### factor
xf <- gl(3, 1)
model.matrix(as.basis(xf), data = data.frame(xf = xf))
### ordered
xf <- gl(3, 1, ordered = TRUE)
model.matrix(as.basis(xf), data = data.frame(xf = unique(xf)))

b    Box Product of Basis Functions

Description
Box product of two basis functions.

Usage
b(..., sumconstr = FALSE)

Arguments
...        objects of class basis
sumconstr  a logical indicating if sum constraints shall be applied

Details
b() joins the corresponding design matrices by the row-wise Kronecker (or box) product.

Examples
### set-up a Bernstein polynomial
xv <- numeric_var("x", support = c(1, pi))
bb <- Bernstein_basis(xv, order = 3, ui = "increasing")
## and treatment contrasts for a factor at three levels
fb <- as.basis(~ g, data = factor_var("g", levels = LETTERS[1:3]))
### join them: we get one intercept and two deviation functions
bfb <- b(bern = bb, f = fb)
### generate data + coefficients
x <- expand.grid(mkgrid(bfb, n = 10))
cf <- c(1, 2, 2.5, 2.6)
cf <- c(cf, cf + 1, cf + 2)
### evaluate bases
model.matrix(bfb, data = x)
### plot functions
plot(x$x, predict(bfb, newdata = x, coef = cf), type = "p", pch = (1:3)[x$g])
legend("bottomright", pch = 1:3, legend = colnames(model.matrix(fb, data = x)))

Bernstein_basis    Bernstein Basis Functions

Description
Basis functions defining a polynomial in Bernstein form.

Usage
Bernstein_basis(var, order = 2,
                ui = c("none", "increasing", "decreasing", "cyclic",
                       "zerointegral", "positive", "negative",
                       "concave", "convex"),
                extrapolate = FALSE, log_first = FALSE)

Arguments
var          a numeric_var object
order        the order of the polynomial, one defines a linear function
ui           a character describing possible constraints
extrapolate  logical; if TRUE, the polynomial is extrapolated linearly outside support(var). In particular, the second derivative of the polynomial at support(var) is constrained to zero.
log_first    logical; the polynomial in Bernstein form is defined on the log-scale if TRUE. It makes sense to define the support as c(1, q), i.e. putting the first basis function of the polynomial on log(1).

Details
Bernstein_basis returns a function for the evaluation of the basis functions with corresponding model.matrix and predict methods.

References
Rida T. Farouki (2012), The Bernstein Polynomial Basis: A Centennial Retrospective, Computer Aided Geometric Design, 29(6), 379-419, /10.1016/j.cagd.2012.03.001

Examples
### set-up basis
bb <- Bernstein_basis(numeric_var("x", support = c(0, pi)), order = 3,
                      ui = "increasing")
### generate data + coefficients
x <- as.data.frame(mkgrid(bb, n = 100))
cf <- c(1, 2, 2.5, 2.6)
### evaluate basis (in two equivalent ways)
bb(x[1:10, , drop = FALSE])
model.matrix(bb, data = x[1:10, , drop = FALSE])
### check constraints
cnstr <- attr(bb(x[1:10, , drop = FALSE]), "constraint")
all(cnstr$ui %*% cf > cnstr$ci)
### evaluate and plot Bernstein polynomial defined by
### basis and coefficients
plot(x$x, predict(bb, newdata = x, coef = cf), type = "l")
### evaluate and plot first derivative of
### Bernstein polynomial defined by basis and coefficients
plot(x$x, predict(bb, newdata = x, coef = cf, deriv = c(x = 1)), type = "l")
### illustrate constrained estimation by toy example
N <- 100
order <- 10
x <- seq(from = 0, to = pi, length.out = N)
y <- rnorm(N, mean = -sin(x) + .5, sd = .5)
if (require("coneproj")) {
  prnt_est <- function(ui) {
    xv <- numeric_var("x", support = c(0, pi))
    xb <- Bernstein_basis(xv, order = 10, ui = ui)
    X <- model.matrix(xb, data = data.frame(x = x))
    uiM <- as(attr(X, "constraint")$ui, "matrix")
    ci <- attr(X, "constraint")$ci
    if (all(is.finite(ci)))
      parm <- qprog(crossprod(X), crossprod(X, y), uiM, ci, msg = FALSE)$thetahat
    else
      parm <- coef(lm(y ~ 0 + X))
    plot(x, y, main = ui)
    lines(x, X %*% parm, col = col[ui], lwd = 2)
  }
  ui <- eval(formals(Bernstein_basis)$ui)
  col <- 1:length(ui)
  names(col) <- ui
  layout(matrix(1:length(ui), ncol = ceiling(sqrt(length(ui)))))
  tmp <- sapply(ui, function(x) try(prnt_est(x)))
}

c.basis    Join Basis Functions

Description
Concatenate basis functions column-wise.

Usage
## S3 method for class 'basis'
c(..., recursive = FALSE)

Arguments
...        objects of class basis
recursive  always FALSE

Details
c() joins the corresponding design matrices column-wise, i.e., the two functions defined by the two bases are added.

Examples
### set-up Bernstein and log basis functions
xv <- numeric_var("x", support = c(1, pi))
bb <- Bernstein_basis(xv, order = 3, ui = "increasing")
lb <- log_basis(xv, remove_intercept = TRUE)
### join them
blb <- c(bern = bb, log = lb)
### generate data + coefficients
x <- as.data.frame(mkgrid(blb, n = 100))
cf <- c(1, 2, 2.5, 2.6, 2)
### evaluate bases
model.matrix(blb, data = x[1:10, , drop = FALSE])
### evaluate and plot function defined by
### bases and coefficients
plot(x$x, predict(blb, newdata = x, coef = cf), type = "l")
### evaluate and plot first derivative of function
### defined by bases and coefficients
plot(x$x, predict(blb, newdata = x, coef = cf, deriv = c(x = 1)), type = "l")

intercept_basis    Intercept-Only Basis Function

Description
A simple intercept as basis function.

Usage
intercept_basis(ui = c("none", "increasing", "decreasing"), negative = FALSE)

Arguments
ui        a character describing possible constraints
negative  a logical indicating negative basis functions

Details
intercept_basis returns a function for the evaluation of the basis functions with corresponding model.matrix and predict methods.

Examples
### set-up basis
ib <- intercept_basis()
### generate data + coefficients
x <- as.data.frame(mkgrid(ib))
### 2 * 1
predict(ib, newdata = x, coef = 2)

Legendre_basis    Legendre Basis Functions

Description
Basis functions defining a Legendre polynomial.

Usage
Legendre_basis(var, order = 2,
               ui = c("none", "increasing", "decreasing", "cyclic",
                      "positive", "negative"), ...)

Arguments
var    a numeric_var object
order  the order of the polynomial, one defines a linear function
ui     a character describing possible constraints
...    additional arguments passed to legendre.polynomials

Details
Legendre_basis returns a function for the evaluation of the basis functions with corresponding model.matrix and predict methods.

References
Rida T. Farouki (2012), The Bernstein Polynomial Basis: A Centennial Retrospective, Computer Aided Geometric Design, 29(6), 379-419, /10.1016/j.cagd.2012.03.001

Examples
### set-up basis
lb <- Legendre_basis(numeric_var("x", support = c(0, pi)), order = 3)
### generate data + coefficients
x <- as.data.frame(mkgrid(lb, n = 100))
cf <- c(1, 2, 2.5, 1.75)
### evaluate basis (in two equivalent ways)
lb(x[1:10, , drop = FALSE])
model.matrix(lb, data = x[1:10, , drop = FALSE])
### evaluate and plot Legendre polynomial defined by
### basis and coefficients
plot(x$x, predict(lb, newdata = x, coef = cf), type = "l")

log_basis    Logarithmic Basis Function

Description
The logarithmic basis function.

Usage
log_basis(var, ui = c("none", "increasing", "decreasing"),
          remove_intercept = FALSE)

Arguments
var    a numeric_var object
ui     a character describing possible constraints
remove_intercept
       a logical indicating if the intercept term shall be removed

Details
log_basis returns a function for the evaluation of the basis functions with corresponding model.matrix and predict methods.

Examples
### set-up basis
lb <- log_basis(numeric_var("x", support = c(0.1, pi)))
### generate data + coefficients
x <- as.data.frame(mkgrid(lb, n = 100))
### 1 + 2 * log(x)
max(abs(predict(lb, newdata = x, coef = c(1, 2)) - (1 + 2 * log(x$x))))

polynomial_basis    Polynomial Basis Functions

Description
Basis functions defining a polynomial.

Usage
polynomial_basis(var, coef, ui = NULL, ci = NULL)

Arguments
var   a numeric_var object
coef  a logical defining the order of the polynomial
ui    a matrix defining constraints
ci    a vector defining constraints

Details
polynomial_basis returns a function for the evaluation of the basis functions with corresponding model.matrix and predict methods.

Examples
### set-up basis of order 3 omitting the quadratic term
pb <- polynomial_basis(numeric_var("x", support = c(0, pi)),
                       coef = c(TRUE, TRUE, FALSE, TRUE))
### generate data + coefficients
x <- as.data.frame(mkgrid(pb, n = 100))
cf <- c(1, 2, 0, 1.75)
### evaluate basis (in two equivalent ways)
pb(x[1:10, , drop = FALSE])
model.matrix(pb, data = x[1:10, , drop = FALSE])
### evaluate and plot polynomial defined by
### basis and coefficients
plot(x$x, predict(pb, newdata = x, coef = cf), type = "l")

predict.basis    Evaluate Basis Functions

Description
Evaluate basis functions and compute the function defined by the corresponding basis.

Usage
## S3 method for class 'basis'
predict(object, newdata, coef, dim = !is.data.frame(newdata), ...)
## S3 method for class 'cbind_bases'
predict(object, newdata, coef, dim = !is.data.frame(newdata),
        terms = names(object), ...)
## S3 method for class 'box_bases'
predict(object, newdata, coef, dim = !is.data.frame(newdata), ...)

Arguments
object   a basis or bases object
newdata  a list or data.frame
coef     a vector of coefficients
dim      either a logical indicating that the dimensions shall be obtained from the bases object, or an integer vector with the corresponding dimensions (the latter option being very experimental)
terms    a character vector defining the elements of a cbind_bases object to be evaluated
...      additional arguments

Details
predict evaluates the basis functions and multiplies them with coef. There is no need to expand multiple variables as predict uses array models (Currie et al., 2006) to compute the corresponding predictions efficiently.

References
Ian D. Currie, Maria Durban, Paul H. C. Eilers (2006), Generalized Linear Array Models with Applications to Multidimensional Smoothing, Journal of the Royal Statistical Society, Series B: Methodology, 68(2), 259-280.

Examples
### set-up a Bernstein polynomial
xv <- numeric_var("x", support = c(1, pi))
bb <- Bernstein_basis(xv, order = 3, ui = "increasing")
## and treatment contrasts for a factor at three levels
fb <- as.basis(~ g, data = factor_var("g", levels = LETTERS[1:3]))
### join them: we get one intercept and two deviation functions
bfb <- b(bern = bb, f = fb)
### generate data + coefficients
x <- mkgrid(bfb, n = 10)
cf <- c(1, 2, 2.5, 2.6)
cf <- c(cf, cf + 1, cf + 2)
### evaluate predictions for all combinations in x (a list!)
predict(bfb, newdata = x, coef = cf)
## same but slower
matrix(predict(bfb, newdata = expand.grid(x), coef = cf), ncol = 3)

Index
∗ package: basefun-package
as.basis, b, basefun (basefun-package), basefun-package, Bernstein_basis, c.basis, intercept_basis, legendre.polynomials, Legendre_basis, log_basis, model.matrix, numeric_var, polynomial_basis, predict.basis, predict.box_bases (predict.basis), predict.cbind_bases (predict.basis), vars
RTKLIB Manual (Chinese edition)

1. Directory structure
\app  -- build environments for the APs
\bin  -- executable binary APs and Windows link libraries (DLLs)
\data -- sample data for the APs
\doc  -- document files
\lib  -- library build environment
\src  -- source programs of the RTKLIB library
\test -- test programs and data
\util -- utility programs

2. \bin\rtklaunch.exe: the application launcher.

3. RTKNAVI: real-time positioning. RTKNAVI inputs raw observation data from GPS/GNSS receivers and performs navigation processing in real time.
3.1 Run \bin\rtknavi.exe.

3.2 For real-time positioning with RTKNAVI, raw GPS/GNSS receiver observation data and satellite ephemerides must be input. Click the "I" button to open the Input Streams dialog and check the settings of the three streams: Rover, Base Station and Correction. Depending on the positioning mode, only the Rover stream may be needed; the Base Station and Correction streams are not always required.
The stream type can be selected from the following options:
(a) Serial: input data via a serial port
(b) TCP Client: connect to a TCP server and input data through the TCP connection
(c) TCP Server: accept a TCP client connection and input data through the TCP connection
(d) NTRIP Client: connect to an NTRIP caster and input data
(e) File: input data from a log file [.conf]
(f) FTP: input data after downloading a file via FTP
(g) HTTP: input data after downloading a file via HTTP

3.3 Select the stream type "Serial" and click the "..." button to set the serial port options.
3.4 If you selected Serial, TCP Client or TCP Server as the stream type, you can set the startup and shutdown commands sent to the GPS/GNSS receiver through the stream. To set the commands, push the "..." button under the "Cmd" label and edit them in the "Serial/TCP Commands" dialog; the commands can also be loaded from and saved to a file.

3.5 With the stream type "File", you can set the path of the input file (raw receiver data) and the replay time.

3.6 To set the output streams, click the "O" button to open the "Output Streams" dialog and set the stream type and format.

3.7 If the type is "File", some keywords in the file path are replaced by the date and time; push the "?" button to see the keyword list. Select "Swap Intv" to swap the output file at the specified interval.

3.8 To log an input stream, click the "L" button to open the "Log Streams" dialog; path keywords are replaced in the same way as in the "Output Streams" dialog.

3.9 After the settings are complete, click the Start button.
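The keyword-replacement idea behind the output file paths can be sketched in plain C. This is a simplified illustration only, not RTKLIB's actual replacement routine or keyword set; it maps strftime-style date/time fields (%Y, %m, %d, %H) into a path template for a given epoch.

```c
#include <assert.h>
#include <string.h>
#include <time.h>

/* Expand date/time keywords in a file path template.
 * Illustrative sketch: the template uses strftime conversion
 * specifiers, so strftime itself performs the replacement for
 * the given UTC epoch. */
static void expand_path(const char *tmpl, char *out, size_t n, time_t t)
{
    struct tm *tmv = gmtime(&t);   /* break epoch into UTC fields */
    strftime(out, n, tmpl, tmv);   /* substitute %Y, %m, %d, %H, ... */
}
```

For example, with the template "log_%Y%m%d_%H.rtcm" and the epoch 0 (1970-01-01 00:00 UTC), the expanded path is "log_19700101_00.rtcm".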
5 GRLIB design concept

5.1 Introduction

GRLIB is a collection of reusable IP cores, organized into multiple VHDL libraries.
Each library provides components from a particular vendor, or a set of shared functions or interfaces. The data structures and component declarations used in a GRLIB design are exported through library-specific VHDL packages. GRLIB is based on the AMBA AHB and APB on-chip buses, which are used as the standard interconnect interfaces. The implementation of the AHB/APB buses is compliant with AMBA 2.0, with additional "sideband" signals. These sideband signals serve three purposes: automatic address decoding, interrupt steering and device identification (a.k.a. plug&play support). The GRLIB libraries group the AHB/APB signals into VHDL records according to their function. The source files of the GRLIB AMBA package are located under lib/grlib/amba/. All GRLIB cores use the same data structures to declare their AMBA interfaces, which makes interconnecting them straightforward. The GRLIB library also contains an AHB bus controller and an AHB/APB bridge; with these two modules, a fully functional AHB/APB system can be assembled quickly. The following sections describe how the AMBA buses are implemented and how a SoC design is built with GRLIB.
5.2 AMBA AHB on-chip bus

5.2.1 General

The AMBA Advanced High-performance Bus (AHB) is a multi-master bus suitable for interconnecting units at a high data rate and/or with variable latency. Figure 5 shows a conceptual view. The units attached to the bus are divided into masters and slaves, and are all controlled by a global bus arbiter. Since the AHB bus is multiplexed (rather than tri-stated), a more correct view of the interconnection between the bus and the units is given in figure 6. Each master drives a set of signals grouped into a VHDL record, HMSTO. The output record of the current bus master is selected by the bus multiplexer and sent to the input record (ahbsi) of all AHB slaves. The output record (ahbso) of the active slave is selected by the bus multiplexer and returned to all masters. A combined bus arbiter, address decoder and bus multiplexer controls which master and which slave are selected.
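The multiplexed interconnect described above can also be modeled behaviorally in software. The following C sketch is purely illustrative (it is not GRLIB code, and the "records" are reduced to a couple of fields): the arbiter's grant index selects which master's output record is driven to all slave inputs, and the decoder's slave index selects whose read data is driven back to all masters.

```c
#include <assert.h>
#include <stdint.h>

#define NAHBMST 4
#define NAHBSLV 4

/* Minimal stand-ins for the master output / slave output records. */
typedef struct { uint32_t haddr; uint32_t hwdata; } ahb_mst_out;
typedef struct { uint32_t hrdata; } ahb_slv_out;

/* Bus multiplexer, master side: the grant index chosen by the
 * arbiter selects which master record is forwarded to the slaves. */
static ahb_mst_out bus_mux_master(const ahb_mst_out m[NAHBMST], int grant)
{
    return m[grant];            /* driven to the ahbsi record of all slaves */
}

/* Bus multiplexer, slave side: the slave index chosen by the
 * address decoder selects which hrdata is returned to the masters. */
static uint32_t bus_mux_slave(const ahb_slv_out s[NAHBSLV], int sel)
{
    return s[sel].hrdata;       /* driven to the hmsti record of all masters */
}
```

In hardware this selection is a combinational multiplexer rather than a function call, but the data flow is the same: all units always see the bus, and only the indices chosen by the arbiter and decoder determine whose signals take effect.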
5.2.2 AHB master interface

The inputs and outputs of an AHB master are defined as VHDL record types, exported through the TYPES package of the GRLIB AMBA library:

-- AHB master inputs
type ahb_mst_in_type is record
  hgrant : std_logic_vector(0 to NAHBMST-1);     -- bus grant
  hready : std_ulogic;                           -- transfer done
  hresp  : std_logic_vector(1 downto 0);         -- response type
  hrdata : std_logic_vector(31 downto 0);        -- read data bus
  hcache : std_ulogic;                           -- cacheable
  hirq   : std_logic_vector(NAHBIRQ-1 downto 0); -- interrupt result bus
end record;

-- AHB master outputs
type ahb_mst_out_type is record
  hbusreq : std_ulogic;                           -- bus request
  hlock   : std_ulogic;                           -- lock request
  htrans  : std_logic_vector(1 downto 0);         -- transfer type
  haddr   : std_logic_vector(31 downto 0);        -- address bus (byte)
  hwrite  : std_ulogic;                           -- read/write
  hsize   : std_logic_vector(2 downto 0);         -- transfer size
  hburst  : std_logic_vector(2 downto 0);         -- burst type
  hprot   : std_logic_vector(3 downto 0);         -- protection control
  hwdata  : std_logic_vector(31 downto 0);        -- write data bus
  hirq    : std_logic_vector(NAHBIRQ-1 downto 0); -- interrupt bus
  hconfig : ahb_config_type;                      -- memory access reg.
  hindex  : integer range 0 to NAHBMST-1;         -- diagnostic use only
end record;

The signals in the record types correspond to the AHB master signals of the AMBA 2.0 specification, with four additional sideband signals: HCACHE, HIRQ, HCONFIG and HINDEX. A typical AHB master in GRLIB is defined as follows:

library grlib;
use grlib.amba.all;
library ieee;
use ieee.std_logic_1164.all;

entity ahbmaster is
  generic (
    hindex : integer := 0);        -- master bus index
  port (
    reset : in  std_ulogic;
    clk   : in  std_ulogic;
    hmsti : in  ahb_mst_in_type;   -- AHB master inputs
    hmsto : out ahb_mst_out_type   -- AHB master outputs
  );
end entity;
GRLIB中一个典型的AHB master定义如下:library grlib;use grlib.amba.all;library ieee;use ieee.std_logic.all;entity ahbmaster isgeneric(hindex:integer:=0);--master bus indexport(reset:in std_ulogic;clk:in std_ulogic;hmsti:in ahb_mst_in_type;--AHB master inputshmsto:out ahb_mst_out_type--AHB master outputs);end entity;输入record(HMSTI)接到各masters,当中包括对全部masters的bus grant(总线允许)信号HMSTI.HGRANT。
因此,一个AHB master必须使用一个generic(常量)来指定具体的哪一根HGRANT是要用的。
这个generic 的type是integer,通常叫作HINDEX(参见上面的例子)。
5.2.3 AHB slave interface

Similar to the AHB master interface, the inputs and outputs of AHB slaves are also defined as two VHDL record types:

-- AHB slave inputs
type ahb_slv_in_type is record
  hsel      : std_logic_vector(0 to NAHBSLV-1);     -- slave select
  haddr     : std_logic_vector(31 downto 0);        -- address bus (byte)
  hwrite    : std_ulogic;                           -- read/write
  htrans    : std_logic_vector(1 downto 0);         -- transfer type
  hsize     : std_logic_vector(2 downto 0);         -- transfer size
  hburst    : std_logic_vector(2 downto 0);         -- burst type
  hwdata    : std_logic_vector(31 downto 0);        -- write data bus
  hprot     : std_logic_vector(3 downto 0);         -- protection control
  hready    : std_ulogic;                           -- transfer done
  hmaster   : std_logic_vector(3 downto 0);         -- current master
  hmastlock : std_ulogic;                           -- locked access
  hbsel     : std_logic_vector(0 to NAHBCFG-1);     -- bank select
  hcache    : std_ulogic;                           -- cacheable
  hirq      : std_logic_vector(NAHBIRQ-1 downto 0); -- interrupt result bus
end record;

-- AHB slave outputs
type ahb_slv_out_type is record
  hready  : std_ulogic;                            -- transfer done
  hresp   : std_logic_vector(1 downto 0);          -- response type
  hrdata  : std_logic_vector(31 downto 0);         -- read data bus
  hsplit  : std_logic_vector(15 downto 0);         -- split completion
  hcache  : std_ulogic;                            -- cacheable
  hirq    : std_logic_vector(NAHBIRQ-1 downto 0);  -- interrupt bus
  hconfig : ahb_config_type;                       -- memory access reg.
  hindex  : integer range 0 to NAHBSLV-1;          -- diagnostic use only
end record;

The signals in the records above correspond to the AHB slave signals of the AMBA 2.0 specification, with five additional sideband signals: HBSEL, HCACHE, HIRQ, HCONFIG and HINDEX. A typical AHB slave in GRLIB is defined as follows:

library grlib;
use grlib.amba.all;
library ieee;
use ieee.std_logic_1164.all;

entity ahbslave is
  generic (
    hindex : integer := 0);        -- slave bus index
  port (
    reset : in  std_ulogic;
    clk   : in  std_ulogic;
    hslvi : in  ahb_slv_in_type;   -- AHB slave inputs
    hslvo : out ahb_slv_out_type   -- AHB slave outputs
  );
end entity;
因此一个AHB slave必须使用一个generic(常量)来指定具体哪一个hsel是需要使用的。