

iNOmax Instruction Manual (Chinese)


Use of Inhaled Nitric Oxide (iNO) Clinical Practice Guideline

Purpose
To provide guidelines for the initiation of inhaled nitric oxide therapy via the iNOmax delivery system by completing the pre-use system purge and performance test.

Audience
Physicians, nursing staff, and licensed respiratory care practitioners.

Scope
Inhaled nitric oxide is a selective pulmonary vasodilator. Nitric oxide, the active substance in iNOmax, is a gaseous blend of 0.08% NO and 99.92% nitrogen, corresponding to a concentration of 800 ppm.

Physician's Order
Physician orders must include the following:
• Mechanical ventilation parameters
• Nitric oxide concentration in ppm
• Methemoglobin levels, recommended at 1 hour and 8 hours after initiation of therapy and at physician discretion

Indications
Inhaled nitric oxide therapy is indicated in term and near-term (>34 weeks) neonates with hypoxic respiratory failure that is associated with (iNO package insert, 2014):
• Meconium aspiration syndrome (MAS)
• Pneumonia/sepsis
• Persistent pulmonary hypertension of the newborn (PPHN)
• Congenital diaphragmatic hernia (CDH)
• Respiratory distress syndrome (RDS)
• Off-label use for post-operative cardiac patients, such as those with an LVAD
• Respiratory failure (ARDS)

Goals
The goals of delivering inhaled nitric oxide are to:
• Improve and maintain oxygenation
• Reduce the need for extracorporeal membrane oxygenation (ECMO)
• Relieve primary pulmonary artery hypertension

Adverse Effects
The adverse effects associated with inhaled nitric oxide therapy include:
• Rebound: abrupt discontinuation of iNO may lead to worsening oxygenation and increasing pulmonary artery pressure
• Methemoglobinemia: risk increases with the dose of iNO
• Increased levels of NO2
Inhaled nitric oxide therapy should not be used in patients who are dependent on right-to-left shunting of blood.

Equipment
• iNOmax delivery system
• Full NO cylinders

Procedure
• Initiation of therapy with conventional mechanical ventilation, infant nasal CPAP (NCPAP), and high-flow nasal cannula systems
• Initiation of therapy with high-frequency oscillatory ventilation

Assessment of Outcome
1. Arterial, venous, and capillary blood gas values
2. Pulse oximetry
3. Pulmonary artery pressures

Monitoring during iNO
The risk of methemoglobinemia increases with the dose of nitric oxide. In clinical trials, maximum methemoglobin levels were typically reached 8 hours after the initiation of therapy, though peak levels were seen as late as 40 hours after the start of therapy (iNO package insert, 2014).

Infection Control
Follow procedures as outlined in Healthcare Epidemiology Policies and Procedures: #2.24 Respiratory Care Services. /policy/hcepidem/search/02-24.pdf

Definitions (Kumar 2007, Carriedo 2003)
Responder: A patient is classified as a responder to iNO therapy if the post-ductal PaO2 increases >20 mmHg within 30 minutes of starting iNO without any change in the inspired oxygen concentration. If there is no increase with the initial dose (5-15 ppm), iNO should be increased to 20 ppm.
Non-Responder: If, after 30 minutes at 20 ppm, the post-ductal PaO2 does not increase >20 mmHg, the patient is classified as a non-responder and therapy should be discontinued, as follows:
• The patient should be classified as a responder or a non-responder after <1 hour of iNO. Non-responders should start the weaning process, as above.
• An increase in O2 saturation of >10% will be considered equivalent to an increase in PaO2 of >20 mmHg.

Weaning from iNO (Davidson, 1999)
Responders may be weaned from iNO when any of the following conditions are met:
• PaO2 >60 mmHg on FiO2 <0.60
• Oxygenation index (OI) <10
If PaO2 or SpO2 falls by >10% after any weaning step, the dose of iNO may be increased back to the previous dose, with appropriate increases in FiO2.
*Deviations from this strategy may be indicated if the patient is particularly sensitive to changes.

Safety Precautions
Oxygen safety techniques as outlined in the iNOmax manual will be followed. All alarms on ventilators will be activated at all times. All alarms on the INOvent delivery system will be activated at all times.

References
Sensormedics High Frequency Oscillatory Ventilator Operating Manual.
Whitaker K. Comprehensive Perinatal and Pediatric Respiratory Care. Delmar, Albany, New York, 2001.
Cornfield DN, Abman SH. Inhalational nitric oxide in pulmonary parenchymal and vascular disease. Journal of Laboratory and Clinical Medicine. 1996 Jun;127(6):530-539.
Miller CL. Nitric oxide therapy for the persistent pulmonary hypertension of the newborn. Neonatal Network. 1995 Dec;14(8):9-15.
Angelucci P. A new weapon against ARDS (adult respiratory distress syndrome). RN. 1996 Nov;59(11):22-25.
Respiratory Care. May 2015;60(5).
Barrington K, Finer N. Inhaled nitric oxide for preterm infants: a systematic review. Pediatrics. 2007;120:1088-1099.
Carriedo H, Rhine W. Withdrawal of inhaled nitric oxide from non-responders after short exposure. Journal of Perinatology. 2003;23:556-558.
Davidson D, Barefield E, Kattwinkel J, et al. Safety of withdrawing inhaled nitric oxide therapy in persistent pulmonary hypertension of the newborn. Pediatrics. 1999;104:231-236.
INOmax® [Package Insert]. Clinton, NJ: INO Therapeutics, Inc; 1999. Revised April 2004.
Kumar VH, Hutchison AA, et al. Characteristics of pulmonary hypertension in preterm infants. Journal of Perinatology. 2007.
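The responder/non-responder criteria in the Definitions section above reduce to two numeric thresholds. Purely as an illustration of that decision logic (the function name and parameters are my own, and this is in no way a clinical tool), the classification could be sketched as:

```python
def classify_ino_response(delta_pao2_mmhg: float = None,
                          delta_spo2_percent: float = None) -> str:
    """Classify a patient per the guideline's definitions: a responder shows a
    post-ductal PaO2 rise of >20 mmHg within 30 minutes of starting iNO with
    no change in FiO2; an SpO2 rise of >10% is treated as equivalent.
    Illustrative encoding only -- not for clinical use."""
    if delta_pao2_mmhg is not None and delta_pao2_mmhg > 20:
        return "responder"
    if delta_spo2_percent is not None and delta_spo2_percent > 10:
        return "responder"
    return "non-responder"
```

Per the guideline, a non-responder result at the initial dose would first prompt an increase to 20 ppm before therapy is discontinued.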

Super Mario Bros. 2 (US version) Original Soundtrack Piano Score - Koji Kondo (近藤浩治)




[Sheet music not reproduced here. Surviving markings: tempo "Fast, quarter note = 145", dynamic mf, first and second endings.]
* I did not include sound effects not playable by the piano.
Original Super Mario Brothers BGM Property of Nintendo
Foreword
First of all, I want to thank the people at Nintendo for creating this video game classic, which I have fond memories of playing for many hours during my early teen years and well into my 20s. I want to especially thank the composer, Kondo Koji, for composing such memorable tunes, which will remain with all of us who have played this wonderful game series. I first heard the piano arrangements of the Super Mario series on the internet played by The Blindfolded Pianist, a.k.a. Martin Leung, about 4-5 years back, and I was thrilled to hear these gems on the piano. As most people probably did, I looked for the sheet music, but I couldn't find complete transcriptions/arrangements for the piano except the ones done by Martin. Recently I watched Martin's performance video again, and it was then that I decided that if I couldn't purchase the scores, I would do my own transcriptions and arrangements. So it is to Mr. Martin Leung that I owe the inspiration to do my own transcriptions of these musical gems. Since I couldn't find any "officially" published music scores for the BGM (background music) of the Super Mario series, I did my searches on the net and got my hands on many MIDI files and NSF (Nintendo Sound Format) files to play and listen to in order to notate the music. As for notating the music, I used the Sibelius notation software to make the scores. It has taken me many hours of listening, transcribing, arranging, notating, and editing. I have tried to be as faithful to the original music as possible, with some additional elaborations and extensions done by me. As for the level of performance difficulty, most of these are HARD! I arranged these as "concert transcriptions", which are usually technically very difficult and need to be practiced very diligently. With diligent and hard practice, the result of your labor should be fruitful. With this being said, I hope you enjoy these gems. p.s.
Please show your appreciation by mentioning and crediting me as the transcriber/arranger if you ever perform or record these. Thank you. Philip Kim (フィリップ) March 2007 pskim71@

Execution time (latency)

Make the Common Case Fast
Perhaps the most important and pervasive principle of computer design is to make the common case fast: in making a design trade-off, favor the frequent case over the infrequent case.
For example, consider the FPSQR (floating-point square root) hardware of a processor designed for graphics. One proposal is to enhance the FPSQR hardware and speed up this operation by a factor of 10. The other alternative is just to try to make all FP instructions run faster. (Amdahl's Law can also be used to compare two different CPU designs; this matters especially for processors that handle graphics, where floating-point square root is computed frequently.)
Amdahl's Law
Speedup = (performance for the entire task using the enhancement when possible) / (performance for the entire task without the enhancement) = (execution time for the entire task without the enhancement) / (execution time for the entire task using the enhancement when possible)
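A minimal sketch of this calculation in Python, applied to the FPSQR example above. The execution-time fractions (FPSQR 20%, all FP instructions 50%) and the 1.6x all-FP speedup are illustrative assumptions, not figures taken from this text:

```python
def amdahl_speedup(enhanced_fraction: float, enhancement_speedup: float) -> float:
    """Overall speedup when `enhanced_fraction` of the original execution
    time is accelerated by `enhancement_speedup` (Amdahl's Law)."""
    return 1.0 / ((1.0 - enhanced_fraction) + enhanced_fraction / enhancement_speedup)

# Option 1: make FPSQR 10x faster; assume FPSQR is 20% of execution time.
print(round(amdahl_speedup(0.20, 10.0), 2))  # 1.22
# Option 2: make all FP instructions 1.6x faster; assume FP is 50% of time.
print(round(amdahl_speedup(0.50, 1.6), 2))   # 1.23
```

Under these assumed fractions, the modest speedup applied to the common case edges out the dramatic speedup applied to the rare case, which is exactly the "make the common case fast" principle above.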

Anton Paar Carbon Type Distribution Analyzer Product Introduction


Carbon Type Distribution of Petroleum Oils with SVM™ 4001 and Abbemat

Relevant for: petroleum industry - research, production and incoming quality control of base oils, lube oils and process oils. Measure the required parameters and calculate the carbon distribution and ring content of oils according to ASTM D3238 in one go, within minutes.

1 Why determine carbon type distribution?
The carbon type distribution expresses the gross composition of the heavier fractions of petroleum in terms of paraffinic, naphthenic, and aromatic components. It is one of the most important parameters for the qualification of base oils, lube oils, process oils, or plasticizers, because it correlates directly with critical product performance properties.
According to the standard ASTM D3238, the carbon distribution and ring content of olefin-free petroleum oils is calculated from measurements of refractive index, density and molecular weight (the n-d-M method). The mean molecular weight can be calculated following ASTM D2502 from viscosity measurements at 37.78 °C and 98.89 °C (100 °F and 210 °F). So the following basic parameters are required:
• kinematic viscosity at 37.78 °C and 98.89 °C (obtained from the SVM™ 4001)
• refractive index at 20 °C (obtained from the refractometer)
• density at 20 °C (calculated from the measured density values by the SVM™ software)
Further, the mean molecular weight is required. It is calculated from the kinematic viscosities at 37.78 °C (100 °F) and 98.89 °C (210 °F) according to ASTM D2502. From all these parameters, the carbon distribution (CA, CN, CP) and ring content (RT, RA, RN) are determined according to the formulas in ASTM D3238.
This report describes specifically how to test petroleum oils with the SVM™ 4001 (according to ASTM D7042, D4052 and D2502) in combination with an Abbemat refractometer to obtain the carbon type distribution according to ASTM D3238.

2 Which instruments are used?
For the viscosity and density measurement, the SVM™ 4001 Stabinger Viscometer™, with two measuring cells for simultaneous viscosity measurement at two temperatures, is used. For the RI measurement, the Abbemat 550 is used. Connected via CAN interface, it is a module controlled by the SVM™ 4001 as master instrument.
Tip: Any other Anton Paar refractometer from the Performance/Performance Plus line (300/350 or 500) or from the Heavy Duty line (450/650) can be used.

3 Which samples are tested?
Four oil samples, as listed below, were tested:

Sample         Description
Nytro 4000X    Severely hydrotreated insulating oil
T110           Severely hydrotreated base oil
Nyflex 3150    Severely hydrotreated process oil
Nypar 315      Severely hydrotreated process oil

Samples were kindly provided by Nynas AB, Sweden.

4 Sample measurement
4.1 Instrument setup
Method: "SVM 4001 VI + Abbemat"
SVM™ 4001: According to ASTM D7042, the following settings are predefined by default:
• Measuring temperatures: cell 1: 37.78 °C, cell 2: 98.89 °C
• Precision class "Precise"
• RDV limit 0.10 %
• RDD limit 0.0002 g/cm³
• 5 determinations
• Automatic prewetting: yes
• Sulfur correction: activated (enter the value if the sulfur content is 0.8 % or higher to improve the accuracy of the CTC calculation)
• Drying time: 150 s (built-in air pump); when using compressed air at 2 bar: 60 s
Abbemat refractometer: The method SVM + Abbemat includes the following settings for the refractometer:
• Temperature: 20 °C
• Measurement accuracy "Most Precise"
• Hold time: 1 s
• Timeout: 200 s
• Wavelength: 589.3 nm (fixed parameter)

4.2 Calibration
Use only a calibrated instrument. The calibration shall be performed periodically using certified reference standards.
According to ASTM D7042, the reference standards shall be certified by a laboratory that meets the requirements of ISO/IEC 17025 or a corresponding national standard. Viscosity standards should be traceable to master viscometer procedures. The uncertainty for density standards must not exceed 0.0001 g/cm³. For each certified value the uncertainty should be stated (k = 2; 95 % confidence level). Use one or more standards in the viscosity range of your oil sample(s). If required, apply a calibration correction to improve the reproducibility. To perform a calibration (correction), refer to the SVM™ X001 Reference Guide. For the refractometer, perform at least a water check. For checks and adjustments of the Abbemat, refer to the Abbemat documentation.

4.3 Sample preparation
If the sample is not freshly drawn from a production line, homogenizing the test specimen may improve the measurement repeatability. For some samples, degassing may be required. Refer to the SVM™ X001 Reference Guide.

4.4 Filling
10 mL single-use syringes are recommended so there is enough sample for refills. Never use syringes with rubber seals, as the rubber is not chemically resistant and such syringes tend to draw bubbles. Ensure that the system (measuring cells and hoses) is leak-tight, clean and dry. For flow-through filling, inject approx. 4.5 mL as the first filling. After prewetting, refill at least 1 mL or until the sample in the waste hose is free of bubbles. The typical amount for valid results is approx. 7 mL; the volume can vary depending on the sample.

4.5 Cleaning
4.5.1 Solvents
Ensure that the solvent starts boiling at a temperature higher than the measuring temperature.
Otherwise, a lack of cleaning in the hot upper cell may impact the measuring results. Petroleum benzine 100/140 (an aliphatic hydrocarbon solvent mixture with a boiling range of 100 to 140 °C, i.e. 212 to 284 °F) is a universal solvent suitable for most oils. Some oils may require an aromatic solvent, as they are not completely soluble in petroleum benzine. If so, use toluene or xylene as the first solvent and the aliphatic hydrocarbon solvent as the drying solvent. Avoid using acetone or ethanol, as these solvents start boiling below the temperature of the upper cell and are not suitable for most oils. For details, see the SVM™ X001 Reference Guide.

4.5.2 Cleaning procedure
• Tap the cleaning button to open the cleaning screen. Observe it during cleaning to get information on the cleaning status of the SVM™.
• Remove the sample from the cells (push it through using an air-filled syringe).
• Fill approx. 5 mL of solvent using a syringe and leave the syringe connected (a 5 mL syringe works well for cleaning purposes).
• Tap the motor speed button to improve the cleaning performance in the viscosity cell, then stop it again.
• Move the plunger of the syringe back and forth (motor at filling speed) to improve the cleaning performance in the cells of the SVM™ and Abbemat.
• Blow air for some seconds through the cells to remove the sample-solvent mixture.
• Repeat the procedure until the liquid has reached approximately the solvent's viscosity while the motor is turning at high speed.
• Perform a final flush with a drying solvent to remove any residues.
• Observe the cleaning screen. Dry the measuring cells until the cleaning value turns green and stays steadily green.
• Set a sufficiently long drying time to ensure that the Abbemat cell (at 20 °C) is also completely dry.
For details, see the SVM™ X001 Reference Guide.

5 Results
For this report, the measurement and calculation results obtained from the SVM™ 4001 and Abbemat 550 are compared with the reference values on the respective data sheets (PDS, CoA).

Carbon type analysis:

Table 1: ASTM D3238 (n-d-M) carbon distribution (mean of 4 measurements)
Sample         %CA     %CN     %CP
T110           13.80   34.73   51.40
Nypar 315       0.20   30.55   69.23
Nyflex 3150     9.63   29.03   61.30
Nytro 4000X     2.35   47.30   50.40

Table 2: ASTM D3238 (n-d-M) ring content (mean of 4 measurements)
Sample         RT      RA      RN
T110           3.00    0.68    2.32
Nypar 315      1.58    0.01    1.57
Nyflex 3150    2.95    0.58    2.37
Nytro 4000X    1.96    0.08    1.88

Table 3: Deviation from typical sample values* (deviations in percentage points)
Sample         CA                       CN         CP
T110           IR: -1.20; D2140: 2.80    4.28      -1.40
Nypar 315      fulfilled**               3.45       4.22
Nyflex 3150    IR: 0.63; D2140: 2.63    -3.98       1.30
Nytro 4000X    IR: -1.65                IR: 2.30   IR: -0.60
* Reference values / typical values were obtained by different methods. Where not mentioned, the value was determined by ASTM D2140.
** Value must be < 1.

Table 4: Refractive index and deviation from typical values at 20 °C
Sample         RI meas. [nD]   RI typ. [nD]   Dev. [nD]
T110           1.5035          1.502          0.0015
Nypar 315      1.4681          1.468          0.0001
Nyflex 3150    1.4949          1.494          0.0009
Nytro 4000X    1.4746          n.a.           n.a.

Table 5: ASTM D2502 mean molecular mass
Sample         Value [g/mol]   Range          Meets range
T110           399.59          352 ... 408    OK
Nypar 315      371.59          368 ... 392    OK
Nyflex 3150    494.49          468 ... 505    OK
Nytro 4000X    273.01          n.a.           n.a.

6 Conclusion
The assembly of the SVM™ 4001 with an Abbemat is perfectly suitable for determining the carbon type distribution of petroleum oils, provided that all requirements according to section 4, "Sample measurement", are fulfilled.

Figure 1: SVM™ 4001 with Abbemat 550

7 Literature
• ASTM D7042: Standard Test Method for Dynamic Viscosity and Density of Liquids by Stabinger Viscometer (and the Calculation of Kinematic Viscosity)
• ASTM D3238: Standard Test Method for Calculation of Carbon Distribution and Structural Group Analysis of Petroleum Oils by the n-d-M Method
• ASTM D2502: Standard Test Method for Estimation of Mean Relative Molecular Mass of Petroleum Oils from Viscosity Measurements
• Anton Paar Application Report: SVM™ 3001 with Abbemat for Transformer Oils, Doc. No. D89IA013EN

Contact: Anton Paar GmbH, Tel: +43 316 257-0

APPENDIX
Appendix A. Carbon type analysis
Carbon-type analysis expresses the average share of carbon atoms that occur in aromatic, naphthenic and paraffinic structures, reporting:
• the percentage of carbon atoms in aromatic ring structures (%CA),
• the percentage in naphthenic ring structures (%CN), and
• the percentage in paraffinic side chains (%CP).
There are several physical-property correlations for carbon type analysis. In this application report, the n-d-M method (refractive index - density - mean relative molecular mass), standardized as ASTM D3238, is described. Besides this, a further empirical procedure exists, the VGC-ri method (viscosity gravity constant - refractivity intercept), standardized as ASTM D2140.

Why carbon type analysis?
Base oils, process oils and other petroleum oils are checked for their carbon type distribution. Oils with a specified carbon type distribution are designed for different industries. Carbon type analysis according to ASTM D3238 is further used to quantify aromatics in diesel fuel. Major groups for this kind of analysis are process oils.
Knowing the carbon type distribution is important for improving product properties, process efficiency and reliability. Process oils are used in various fields, e.g.:
• As plasticizers in the rubber and polymer industry, e.g. for automotive tires, sealants, footwear or other rubber products. Properties of the ready-to-use product such as elasticity, grip, durability, low-temperature performance and environmental sustainability, as well as solvency and compatibility with the rubber compound during production, depend heavily on the process oil used. Such oils can be of aromatic, naphthenic or paraffinic type.
• As textile auxiliary formulations in the production of yarns. They are used to reduce or avoid dust formation, and to prevent wear and rupture of fibers, electrostatic charging and more. Such oils should have a very low aromatic hydrocarbon content and a high viscosity index.
• In the production of cosmetics. Such oils need to have a very low aromatic hydrocarbon content and must meet the requirements for medical white oil.
Nevertheless, there are also process oils that are analyzed according to ASTM D2140.

ASTM D3238 (n-d-M)
"Standard Test Method for Calculation of Carbon Distribution and Structural Group Analysis of Petroleum Oils by the n-d-M Method"
This test method covers the calculation of the carbon distribution and ring content of olefin-free petroleum oils from measurements of refractive index, density and mean relative molecular mass. The refractive index and density of the oil are determined at 20 °C.
The mean relative molecular mass is estimated from measurements of viscosity at 37.78 °C and 98.89 °C (100 °F and 210 °F). These data are then used to calculate:
• the carbon distribution: the percentage of the total number of carbon atoms present in aromatic rings (%CA), naphthenic rings (%CN) and paraffinic chains (%CP), or
• the ring analysis: the proportions of aromatic rings (RA), naphthenic rings (RN), and paraffinic chains (CP) that would comprise a hypothetical mean molecule.

ASTM D2502 - mean relative molecular mass
"Standard Test Method for Estimation of Molecular Weight (Relative Molecular Mass) of Petroleum Oils From Viscosity Measurements"
The mean relative molecular mass is a fundamental physical constant that can be used in conjunction with other physical properties to characterize hydrocarbon mixtures. This procedure covers the estimation of the mean relative molecular mass of petroleum oils or hydrocarbon fractions from kinematic viscosity measurements at 37.78 °C and 98.89 °C.

"SVM™ 4001 VI + Abbemat" method
Besides the measured input parameters for the carbon type analysis and the analysis results according to ASTM D3238, this method offers many additional useful parameters characterizing your oil:
• kinematic viscosity at 40 °C and 100 °C (extrapolated according to ASTM D341)
• Viscosity Index (according to ASTM D2270)
• carbon type composition according to ASTM D2140, including the viscosity-gravity constant (VGC) following ASTM D2501
• density at 20 °C
• API specific gravity at 15.56 °C (60 °F)
• viscosity gravity constant according to ASTM D2501
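As a rough illustration of the n-d-M workflow described above, the following Python sketch computes carbon distribution and ring content from refractive index, density, mean relative molecular mass and sulfur content. The branch constants are the commonly published van Nes-van Westen correlation values; treat them as an assumption and verify every constant against the current edition of ASTM D3238 before any real use.

```python
def carbon_type_ndm(n20: float, d20: float, M: float, S: float = 0.0) -> dict:
    """n-d-M carbon type analysis (sketch only; constants per the commonly
    published van Nes-van Westen correlation, to be verified against
    ASTM D3238). n20: refractive index at 20 C; d20: density at 20 C in
    g/cm^3; M: mean relative molecular mass; S: sulfur content in wt%."""
    # Intermediate factors from refractive index and density
    v = 2.51 * (n20 - 1.4750) - (d20 - 0.8510)
    w = (d20 - 0.8510) - 1.11 * (n20 - 1.4750)
    # %CA: percentage of aromatic carbon (branch on the sign of v)
    ca = (430.0 if v > 0 else 670.0) * v + 3660.0 / M
    # %CR: percentage of total ring carbon (aromatic + naphthenic)
    if w > 0:
        cr = 820.0 * w - 3.0 * S + 10000.0 / M
    else:
        cr = 1440.0 * w - 3.0 * S + 10600.0 / M
    cn, cp = cr - ca, 100.0 - cr
    # Ring counts per hypothetical mean molecule
    ra = 0.44 + (0.055 if v > 0 else 0.080) * M * v
    rt = 1.33 + (0.146 if w > 0 else 0.180) * M * (w - 0.005 * S)
    rn = rt - ra
    return {"CA": ca, "CN": cn, "CP": cp, "RA": ra, "RN": rn, "RT": rt}
```

By construction, %CA + %CN + %CP = 100 and RT = RA + RN, consistent with Tables 1 and 2 above.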

Promega MSI Analysis System, Version 1.2 - Product Instructions


Technical Manual: MSI Analysis System, Version 1.2 (MD1641) - Product Instructions

Contents
1. Product Introduction
  1.A. MSI Analysis System
  1.B. Introduction to Microsatellite Instability (MSI)
  1.C. Internal Lane Standard 600 (ILS 600)
2. Product Components and Storage Conditions
3. DNA Extraction Methods
4. DNA Amplification with the MSI Analysis System
  4.A. Amplification Reaction Setup
  4.B. Thermal Cycling Parameters
5. Detection of Amplified Fragments Using the ABI PRISM® 310 Genetic Analyzer
  5.A. Matrix Generation (or Spectral Calibration)
  5.B. Sample Preparation
  5.C. Instrument Preparation
6. Detection of Amplified Fragments Using the ABI PRISM® 3100 Genetic Analyzer with Data Collection Software Version 1.0.1 or 1.1
  6.A. Spectral Calibration
  6.B. Sample Preparation
  6.C. Instrument Preparation
7. Detection of Amplified Fragments Using the ABI PRISM® 3100 or 3100-Avant Genetic Analyzer with Data Collection Software Version 2.0, or the Applied Biosystems 3130 or 3130xl Genetic Analyzer
  7.A. Spectral Calibration
  7.B. Sample Preparation
  7.C. Instrument Preparation
8. Data Analysis
  8.A. MSI Analysis Overview
  8.B. Importing Panels and Bins Files into GeneMapper® Software (Versions 4.0 and 4.1)
  8.C. Creating a Data Analysis Method with GeneMapper® Software (Versions 4.0 and 4.1)
  8.D. Creating a Size Standard
  8.E. Data Processing
  8.F. Reviewing the Size Standard
  8.G. Reviewing Sample Data Analysis
9. Troubleshooting
10. Appendix
  10.A. Related Products
  10.B. Allele Frequencies of Mononucleotide Repeat Loci

The English originals of all technical literature are available at /protocols.

Fluke 7526A Precision Process Calibrator Manual


Sales Training Guide - 7526A Precision Process Calibrator

Contents
1. Objective
2. Introduction
  2.1. Key Features At-a-Glance
  2.2. Product Positioning
  2.3. 7526A vs. 525B
3. Target Customers
4. Key Features
  4.1. Features At-a-Glance (Left Side)
  4.2. Features At-a-Glance (Right Side)
  4.3. Standby/Operate Mode
  4.4. HART Communications and Loop Power
  4.5. Thermocouple Source and Measure
  4.6. Internal or External CJC
  4.7. RTD Source and Measure
  4.8. Preset Setpoints
  4.9. Switch Test Mode
  4.10. Pressure Measurement
    4.10.1. 525A Series Pressure Modules
    4.10.2. 700 Series Pressure Modules
5. Competitive Comparison
  5.1. Key Specs/Feature Comparison
  5.2. Price vs. Performance
  5.3. How the 7526A Wins
6. Product Demonstration
  6.1. Current/Pressure (I/P) Transmitter
  6.2. RTD Measure
7. Summary Specifications
8. Ordering Information
9. Sales and Marketing Materials
10. FAQ

1. Objective
Get a quick introduction to the key product features, learn about the target industries and customers, see how the 7526A compares with the competition, learn how to perform a step-by-step product demonstration, and learn where to find available marketing materials.

2. Introduction
The Fluke Calibration 7526A Precision Process Calibrator is a versatile benchtop calibrator designed for the calibration of process instrumentation such as temperature and pressure transmitters, RTD and thermocouple readouts, pressure gauges, digital process simulators, data loggers, and multimeters. An isolated measurement channel allows the 7526A to simultaneously source and measure voltage, current or resistance. Two LCD displays allow the user to view both input and output parameters simultaneously.
2.1. Key Features At-a-Glance
• Sources and measures DC voltage, DC current, resistance, RTDs and thermocouples
• Measures pressure up to 10,000 psi using Fluke 700 or 525A-P series pressure modules
• Measures 4-20 mA loop current
• Sources 24 V DC transmitter loop power
• Tests pressure and thermal switches with a unique automated switch-test function
• Measures thermistors up to 4 kΩ
• Stores up to nine programmable setpoints for each input/output parameter
• Accepts ITS-90 coefficients for accurate SPRT measurements
• Compatible with MET/CAL® Calibration Software

2.2. Product Positioning
The 7526A is positioned, in terms of price and performance, between more expensive high-end multi-product calibrators like the Fluke 5080A and less precise, less versatile handheld field calibrators like the Fluke process calibrators. It is capable of calibrating most handheld process calibration instrumentation, such as loop calibrators, 3.5-digit DMMs, and RTD/thermocouple simulators and readouts. It calibrates temperature and pressure transmitters and can calibrate most documenting process calibrators (2:1 TUR relative to the Fluke 754).

[Figure: product positioning - Fluke 5080A, Fluke 726 & 719, Fluke Calibration 7526A, with a sample 7526A workload]

2.3. 7526A vs. 525B
The 7526A adds the functions and performance improvements listed in the table below. The 525B will remain available to customers who do not need the additional features of the 7526A. The US list price of the 7526A is about 10% higher than that of the 525B.

3. Target Customers
The 7526A primarily targets process manufacturers whose functions include maintaining product quality, reducing waste, improving efficiency and conforming to regulatory standards. To accomplish these goals, field instrumentation used to monitor manufacturing processes must be maintained and calibrated regularly.
Most field instrumentation such temperature/pressure transmitters, pressure gauges, RTD & thermocouple calibrators and readouts, and DMMs can be calibrated by the 7526A. Below is a summary by industry∙ Process manufacturers: QA Mgr., QC Inspector, Process Eng., Validation Eng., Compliance Mgr. ∙ Manufacturing: Mfg. Eng., Asset Mgr., Test Eng., Production Mgr. ∙ R&D: Design Engineer, Engineering Technician∙3rd party cal labs: Lab Manager, Calibration TechnicianSales Training Guide – 7526A Precision Process Calibrator4. Key features4.1. Features at-a-glance (left side)DC voltage output terminals ∙0 Vdc to 100 Vdc ∙Accuracy: 30 ppm (+3 μV)DC current output terminals ∙0 mA to 100 mA ∙Accuracy: 50 ppmRTD/Ω output terminals (two-wire) ∙ 5 Ω to 4 k Ω,∙Accuracy: ± 0.05 ºC, –200 to 630 ºCFour-wire RTD/Ω input terminals ∙Accuracy: ±0.02 ºC∙Pt 385, 100 Ω, –80 to 100 ºCIsolated pressure module input∙Accepts Fluke 700 & 525A series pressure modules Thermocouple input/output terminal∙Accuracy: ± 0.1 ºC∙–100 to 800 ºC (type K)Sales Training Guide – 7526A Precision Process Calibrator4.2. Features at-a-glance (right side)Isolated input terminalsAllows simultaneous source and measure ∙dc voltage (0 to 100 V)∙dc current measurement (0 to 50 mA) ∙Switch-test input∙24 V dc loop power supplyNumeric and secondary function keysIsolated input controls ∙dc voltage/current ∙Switch test mode ∙24 Vdc power supply ∙250 Ω HART resistor ∙Pressure input modeCursor controlsInput/Output Function Keys∙dc voltage/current ∙TC/RTD ∙Pressure mode ∙Unit type ∙Stby/Operate RS-232 PortGPIB IEEE-488Service port Chassis groundPower line voltage selector and fuse compartmentAC power inlet 120/240 ACPower switchStandard PC interface includes RS-232 and IEEE-488. A USB to serial adapter cable in included as standard.Sales Training Guide – 7526A Precision Process Calibrator4.3. Standby / Operate mode4.4. 
HART Communications and Loop PowerSTBY/OPR•In Standby mode , changes to the output value are not applied until you select the Operate mode•In Operate mode , each change to the output value is applied immediately•Voltages more than 30 V are not applied. The mode automatically reverts to Standby for safety•In Thermocouple mode, move through the thermocouple types (including millivolts)•In RTD/Ohms mode , move through the RTD types (including ohms)•In Pressure mode , move through the pressure units Select a secondary function from the numeric keypadSales Training Guide – 7526A Precision Process Calibrator4.5. Thermocouple Source and MeasureThe 7526A sources and measures all common thermocouple types and can display basic millivolts from -10.0 to 75.0 mV. See the extended specifications for the accuracy of each thermocouple type of a given temperature range.Thermocouple types include: B, C, E, J, K, L, N, R, S, T, U, XK, and BP.4.6.Internal or External CJCThe 7526A allows the user to select internal or external cold junction compensation forthermocouple temperature measurements. When external compensation is selected, XCJC is shown on the second line of the display. This mode simply disables the internal CJC and the 7526A will no longer monitor room temperature at the junction. When the internal CJC is disabled, an ice bath is used as a temperature source for the cold junction.4.7.RTD Source and MeasureThe 7526A both sources and measures most common types of RTDs and PRTs. 
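The internal/external CJC behavior described above can be sketched numerically. This is a simplified illustration that approximates a type-K couple as linear (about 40.7 µV/°C near room temperature); the instrument itself uses the full thermocouple characterization tables:

```python
# Rough linear approximation of a type-K thermocouple near room temperature.
# The real instrument uses the full NIST characterization polynomials.
SEEBECK_K_UV_PER_C = 40.7  # approximate type-K sensitivity, microvolts per degC

def hot_junction_temp(measured_uv: float, cold_junction_c: float) -> float:
    """Cold-junction compensation: add back the EMF developed at the cold
    junction, then convert the total EMF to temperature.

    Internal CJC: the meter supplies cold_junction_c from its terminal sensor.
    External CJC: the internal sensor is disabled and an ice bath pins the
    cold junction at 0 degC, so cold_junction_c = 0.
    """
    total_uv = measured_uv + cold_junction_c * SEEBECK_K_UV_PER_C
    return total_uv / SEEBECK_K_UV_PER_C

# Ice-bath reference (external CJC): 4070 uV measured -> ~100 degC
print(round(hot_junction_temp(4070.0, 0.0), 1))   # -> 100.0
# Internal CJC with terminals at 25 degC: the same 100 degC hot junction
# produces a smaller measured EMF, and the compensation restores it.
print(round(hot_junction_temp(3052.5, 25.0), 1))  # -> 100.0
```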
The user can store CVD coefficients for up to five different probes and ITS-90 coefficients for one SPRT calibrated from –200 °C to 660 °C.RTD and thermistor types include:•Pt 385 100 Ω, 200 Ω, 500 Ω, 1000 Ω•Pt 3926 100 Ω•Pt 3916 (JIS) 100 Ω•Ni120 120 Ω•Cu 427 (Minco) 10 Ω•YSI 400 thermistorAlthough not listed above, an SPRT with a nominal resistance of 25.5 Ω can be measured when defined using ITS-90 coefficients.4.8.Preset SetpointsNine preset output setpoints can be stored and recalled for each output mode, including:•Voltage•Current•Each thermocouple type, including millivolts•Each of the five RTD definitionsThe setpoints can be recalled individually or can be cycled up and down with a user-defined dwell time at each setpoint. The automatic cycle feature starts at setpoint number 1 and steps to a user-specified end setpoint number. It then goes back down in reverse sequence and cycles through theSales Training Guide – 7526A Precision Process Calibratorsequence again. If the automatic cycle feature is used, the order of the setpoints must be stored in the desired sequence from setpoint 1 to the end setpoint.4.9.Switch Test ModeTo enter the Switch Test mode, the user presses and holds the Volts/mA key (right side of calibrator) for three seconds. After connecting the switch to the calibrator, the user cycles the switch over its range, first in one direction and then in the other. The 7526A will record the measured parameter where the switch changes state and displays the value. After cycling the switch both up and down, the calibrator will automatically display the “deadband,” or the range over which the switch does not actuate.4.10. Pressure MeasurementThe 7526A automatically recognizes either a Fluke 700 or 525A series pressure module whenconnected and automatically selects the appropriate range. 
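The automated switch test described above — ramp the source up to find the actuation point, ramp it down to find the reset point, then report the deadband — can be sketched as follows. The switch model and its set/reset values are hypothetical:

```python
class PressureSwitch:
    """Toy model of a pressure switch with hysteresis (made-up set/reset points)."""
    def __init__(self, set_point: float = 52.0, reset_point: float = 47.5):
        self.set_point, self.reset_point = set_point, reset_point
        self.closed = False

    def apply(self, psi: float) -> bool:
        if not self.closed and psi >= self.set_point:
            self.closed = True
        elif self.closed and psi <= self.reset_point:
            self.closed = False
        return self.closed

def measure_deadband(switch, lo=0.0, hi=100.0, step=0.5):
    """Ramp up to find the set point, ramp down to find the reset point,
    and report both transition values plus the deadband between them."""
    set_at = reset_at = None
    psi = lo
    while psi <= hi and set_at is None:      # upward ramp
        if switch.apply(psi):
            set_at = psi
        psi += step
    while psi >= lo and reset_at is None:    # downward ramp
        if not switch.apply(psi):
            reset_at = psi
        psi -= step
    return set_at, reset_at, abs(set_at - reset_at)

print(measure_deadband(PressureSwitch()))  # -> (52.0, 47.5, 4.5)
```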
Both displays show pressure at the same time but different units of measure can be displayed if desired.4.10.1.525A Series Pressure Modules4.10.2.700 Series Pressure Modulespetitive comparison5.1.Key specs/feature comparison Competitive ComparisonMake/ModelTC meas.accuracy(type K @ 0 C)RTD meas.accurcy(pt 385 @ 0 °C)Switchtest24V loopsupplyDCV,inputaccuracyDCV,outputaccuracyDCI,inputaccuracyDCI, outputrangeDCI, outputaccuracyResistanceinputaccuracyResistanceoutputaccuracyFluke 7526A0.10 °C0.02 °C yes yes 0.005%+ 0.2 uV0.003%+ 1 mV0.005%+ 1uA0 to 100 mA0.005 %+1uA0.004%+ 2 mΩ0.015 ΩFluke 525B0.16 °C0.02 °C no no no 0.003%+ 1 mVno0 to 100 mA0.005 %+1uA0.02 Ω0.015 ΩMartel 3001 0.14 °C0.02 °C no yes 0.005%+ 0.2 uV0.003%+ 1 mV0.005%+ 1uA0 to 100 mA0.005 %+1uA0.004%+ 2 mΩ0.015 ΩAmetek AMC9100.14 °C0.02 °C no yes 0.005%+ 0.2 uV0.003%+ 1 mV0.005%+ 1uA0 to 100 mA0.005 %+1uA0.004%+ 2 mΩ0.015 ΩOmega CL3001 0.14 °C0.02 °C no yes 0.005%+ 0.2 uV0.003%+ 1 mV0.005%+ 1uA0 to 100 mA0.005 %+1uA0.004%+ 2 mΩ0.015 ΩWika Mensor CED7000 0.14 °C0.02 °C no yes0.005%+ 0.2 uV0.003%+ 1 mV0.005%+ 1uA0 to 100 mA0.005 %+1uA0.004%+ 2 mΩ0.015 ΩFluke 7540.3 °C0.07 °C yes no 0.02%+ 50 uV0.01%+ 50 uV0.01%+ 20 uA22 mA0.02%+ 0.007 mA0.1%+ 10Ω0.02%+ 0.02 ΩBeamex MC50.1 °C (no CJC)0.06 °C yes yes0.02%0.02%0.02%0 to 25 mA0.02%0.02%+ 3.5 mΩ0.04%+ 3.5 mΩAOiP PJ 63010.3 °C n/a no no 0.015%+ 500 uV0.015%+ 500 uV0.02%+ 0.6uA0 to 60 mA0.02%+ 0.8uA0.01 %+ 80 mΩ0.01 %+ 100 mΩTongren TD7600 (to be discontinued)0.2 °C0.2 °C no no0.009%0.009%0.010% 1 to 100 mA0.01%= 7uA0.002%+ 30 mΩ0.8 ΩMeatest M5050.1% + 1 °C0.1% + 0.5 °C no no0.1%+ 1 digit0.05%+ 0.1%0.1%+ 1 digit0 to 22 mA0.05%+ 0.1%no0.1%+ 0.5 ΩEctron 11400.08 °C n/a no no0.0025%0.0025%no no no no no5.2. Price vs. Performancea documenting capability, are positioned below the 7526A.5.3. How the 7526A wins6.Product demonstrationWhat you decide to demonstrate will depend largely on the type of demo kit you purchased and the applicable accessories you have on hand. 
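Several of the demonstrations that follow exercise 4–20 mA loop instrumentation. The linear scaling between loop current and a transmitter's calibrated range — a standard convention, not specific to the 7526A — is simply:

```python
def ma_to_process(ma: float, lo: float, hi: float) -> float:
    """Map a 4-20 mA loop current onto the engineering-unit span [lo, hi]."""
    return lo + (ma - 4.0) / 16.0 * (hi - lo)

def process_to_ma(value: float, lo: float, hi: float) -> float:
    """Inverse mapping: engineering-unit value back to loop current."""
    return 4.0 + (value - lo) / (hi - lo) * 16.0

# Hypothetical 0-100 PSI transmitter: mid-scale (12 mA) corresponds to 50 PSI.
print(ma_to_process(12.0, 0.0, 100.0))  # -> 50.0
print(process_to_ma(25.0, 0.0, 100.0))  # -> 8.0
```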
Due to its versatility, the 7526A can be easily demonstrated using common items you likely have in the office such as a 3.5 digit DMM, a temperature indicator or thermometer readout, a handheld pressure pump and pressure module, a common RTD or just about any thermocouple type with a mini-jack termination. Configuring the 7526A is intuitive and the Users Manual includes a section on applications (chapter 4) that illustrates how to connect most DUTs to the 7526A. Before doing any customer demonstration, ensure beforehand that you have all the necessary test leads to interface the DUT and the calibrator. The examples below are just a sampling of what you might consider for a customer demonstration.6.1.Current/Pressure (I/P) TransmitterRequired items include:▪Pressure source▪Pressure transmitter▪Pressure module▪Test leadsIn this example, the 7526A will source a 4-20 mA current to a pressure transmitter. The 7526A measures the corresponding pressure from the pressure module.1.Disconnect test leads from external devices.2.Select current output on the primary display (left side) by pressing Volts/mA to select dc voltageand current mode.3.If dc voltage mode is shown, push the key again to go to dc current mode.e the numeric keypad to type the necessary output value and push ENTER.5.Select pressure input on the isolated display (right side) by pressing the pressure key.6.If necessary, push the pressure key again to cycle through the pressure units until the desiredunit is shown.7.Attach the pressure module to the pressure source. Connect the Lemo connector from thepressure module to the pressure measurement input of the 7526A.8.Connect the transmitter to the calibrator as shown in the illustration.9.Press the STBY/OPR key to source current to the transmitter.10.To verify and calibrate the transmitter, refer to the transmitter documentation.6.2.RTD MeasureRequired items include:Four-wire RTD (e.g. 
5615-9-S)In this example, a four-wire RTD (although technically incorrect, also referred to as an SPRT) is connected to the 7526A and ITS-90 coefficients are programmed into the calibrator. Using a Fluke Calibration drywell in this demonstration will allow you to cycle the RTD to different setpointtemperatures.The coefficients A- and B- refer to the A4 and B4 coefficients. These are generated when the SPRT is calibrated at the triple points of argon, mercury, and water. This includes the 83.8058 K (–189 °C) to 273.16 K (0 °C) subrange. Coefficients A, B, and C refer to different coefficients based on which subranges of the SPRT were calibrated. For example, if the 273.15 K (0 °C) to 933.473 K (660 °C) subrange was used, A, B, and C would refer to A7, B7, and C7 whereas if the 273.15 K (0 °C) to 692.67 K (420 °C) subrange was used, A and B would refer to A8 and B8 and C=0.To key in the deviation coefficients for a custom SPRT:1.Select RTD measure mode by pressing TC/RTD to select thermocouple and RTD/Ω mode. Ifthermocouple mode is shown, push the key again to go to RTD/Ω mode. If output mode isshown, push and to select input mode.2.Push TYPE/UNITS until the SPRT type is selected.3.Push ENTER to show the prompt “SET(1)/RECALL(2)”.4.Push to select custom SPRT data entry.5.At the “ENTER MIN TEMP” prompt, key in the minimum temperature limit for the SPRT, andpush ENTER.6.At the "ENTER MAX TEMP" prompt, key in the maximum temperature limit for the SPRT, andpush ENTER.7.At the "ENTER RTPW" prompt, key in the nominal resistance value (RTPW) for the SPRT, andpush ENTER.8.At the "ENTER COEFF A" prompt, key in the first (A) deviation coefficient for the SPRT, and pushENTER. 
To key in a coefficient that includes an exponent, key in the mantissa, push ENTER and select the EXP function, key in the exponent, and push ENTER.
9. When prompted, use the same method to key in the second (B), third (C), fourth (A-), and fifth (B-) deviation coefficients.
10. To abort the SPRT entry without storing changes, push TC/RTD.

To use the SPRT:
1. Select RTD measure mode as described above.
2. Push TYPE/UNITS until the SPRT type is selected.
3. Push ENTER to show the prompt "SET(1)/RECALL(2)".
4. Push to recall the SPRT coefficients.

7. Summary specifications
See Extended Specifications for 90-day and 1-year specifications and for specifications covering all RTD and thermocouple types. Pressure specifications are determined by the module (see sections 4.10.1 and 4.10.2 in this Sales Guide for module accuracy). Listed below are 1-year specifications, limited to the more common RTD and thermocouple types. General specifications are included on the following page.

8. Ordering information
Each 7526A includes the calibrator, Users Manual (CD-ROM), Getting Started Guide, AC mains cord, thermocouple shorting jumper, traceable report of calibration, and USB-to-serial cable adapter.

9. Sales and Marketing Materials
•The following sales and marketing materials are accessible from the 7526A Launch Page found at: /7526A-Launch

10. FAQ
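The SPRT entry procedure stores ITS-90 deviation coefficients (A, B, C for the subranges above 0 °C; A- and B- for the subrange below 0 °C). The deviation functions those coefficients feed can be sketched per the ITS-90 definitions. Note that a complete resistance-to-temperature conversion additionally requires the ITS-90 reference function, omitted here, and the example coefficients below are made up:

```python
import math

def its90_deviation_above_0c(w: float, a: float, b: float, c: float = 0.0) -> float:
    """ITS-90 deviation for SPRT subranges above 0 degC:
    dW = a*(W-1) + b*(W-1)**2 + c*(W-1)**3,
    with c = 0 for the 0-420 degC subrange (the A8/B8 case) and c nonzero
    for the 0-660 degC subrange (A7/B7/C7)."""
    d = w - 1.0
    return a * d + b * d ** 2 + c * d ** 3

def its90_deviation_below_0c(w: float, a4: float, b4: float) -> float:
    """ITS-90 deviation for the 83.8058 K to 273.16 K subrange (A-, B-):
    dW = a4*(W-1) + b4*(W-1)*ln(W)."""
    return a4 * (w - 1.0) + b4 * (w - 1.0) * math.log(w)

# Hypothetical SPRT: RTPW = 25.5 ohm, measured R = 35.0 ohm.
w = 35.0 / 25.5  # resistance ratio W = R(T) / R(273.16 K)
# Made-up, deliberately small deviation coefficients, purely for illustration:
print(its90_deviation_above_0c(w, a=-1.2e-4, b=5.0e-6))
```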

Orion Star A211 Benchtop pH Meter Instruction Manual


Power Stirrer USB RS232BNCATC/CONDO/RDOGroundRef.Push down on tab and lift the battery coverOrion Star A211 BenchtopShown when an alarm is set and the alarm value is reached.Indicates the meter is set to be interfaced with a printer or computer viaIndicates the meter is set to be interfaced with a printer or computer via the USB port.Displays the current temperature based on the temperature probe reading orentered temperature value. Shows the origin of the temperature as MAN23Keypad1. Press to power the meter on. When the meter is on, press and quickly release to turn thebacklight on or off or press and hold for about three seconds to power the meter off.3. The f1, f2, and f3 function keys perform a variety of meter operations. The menu-specific operation isf1 in the measurement mode to start a calibration.menu, change the measurement mode, manually log or print a measurement and hold (freeze) a displayed 5Press the f1, f2 and f3 function keys to perform the action shown above each key on the display.Press to turn the meter on.When the meter is on, press and quickly release to turn the display backlight on or off or hold down to turn the meter off.In the measurement mode, press to take a measurement.In the setup, calibration and other menus, press to escape the current menu andreturn to the measurement mode.In the measurement mode, press to enter the setup menu.In the setup, calibration and other menus, press to scroll up through a list of options.In the continuous measurement mode, press to freeze the displayed measurement and press again to unfreeze the measurement.In the setup, calibration and other menus, press to scroll left through a list of options. In the measurement mode, press to change the displayed measurement mode. 
Options are pH, mV, RmV (relative mV) and ORP.In the setup, calibration and other menus, press to scroll right through a list of options.Press to view the data log and calibration log.Press to start or stop the stirrer probe.Keypad Display InformationpH CalibrationOne to five pH buffers can be used for calibration. Always use fresh pH buffers and select buffers that bracket the sample pH and are one to four pH units apart. Prepare the pH electrode according to the instructions in the electrode use guide. Connect the pH electrode and any other electrodes to be used (ATC probe, stirrer probe, reference electrode) to the meter. Power on the meter and set the measurement mode to pH.1. In the measurement mode, press f1 (cal).2. Rinse the pH electrode and any other electrodes in use with distilled water, blot dry with a lint-free tissueand place into the pH buffer.3. When the electrode and buffer are ready, press f3 (start).4. Wait for the pH value on the meter to stabilize and stop flashing and perform one of the following actions:a. Press f2 (accept) to accept the displayed pH value.b. Press f3 (edit)temperature is shown above the numeric entry screen.ii. Press f2 (done) to exit the numeric entry screen.iii. Press f2 (accept) to accept the entered pH value.5. Press f2 (next)to proceed to the next buffer and repeat steps 2 through 4 or press f3 (cal done) to saveand end the calibration. If five buffers are used, the calibration will save and end once the fifth buffer value is accepted.a. If a one point calibration is performed, press f2 (accept) to accept the displayed slope value or pressf3 (edit) to access the numeric entry screen, enter the slope value and press f2 (accept).6. The meter will display the calibration summary including the average slope. Press f1 (meas) to exportthe data to the calibration log or press f2 (print)to export the data to the calibration log and a printer orcomputer. 
The meter will automatically proceed to the measurement mode.45Measurementsetup menu.1. Rinse the pH electrode and any other electrodes in use with distilled water, blot dry with a lint-free tissue and place into the sample.2.3. Start the measurement and wait for it to stabilize.a. If the meter is in AUTO-READ theicon stops flashing, record the pH and temperature of the sample. Press a new measurement.b. If the meter is in continuous mode, the meter will immediately start taking a measurement and update the display whenever the measurement changes. Wait for the display to show ready and record the pH and temperature of the sample.c. If the meter is in timed mode, the meter will log measurements at the preselected time interval, regardless of the measurement stability. The meter will update the display whenever themeasurement changes, so the pH and temperature of the sample can be recorded when the display shows ready .4. 5. Remove the electrode from the sample, rinse with distilled water, blot dry and place into the next sample.6. Repeat steps 2 through 5 for all samples.7. When all samples have been measured, store the electrode according to its user guide.Setup MenuNavigating the Setup Menu1.f3 (select) tof3 (select) to select a setup4.a.f3 (select) to set the value.b. To enter a numeric value, use the numeric entry screen.i. Select the value to be entered by pressing f3 (select) or f3 (edit). The numeric entry screen willpopup on the display.of the numeric entry screen.iii. Press5. Press f1 (back)Setup Menu Overview67pH Buffer Group SelectionThe selected buffer group allows for the automatic recognition of certain pH buffers during a pH calibration. 
The USA buffer group includes pH 1.68, 4.01, 7.00, 10.01 and 12.46 buffers and the DIN buffer group includes pH 1.68, 4.01, 6.86, and 9.18 buffers.1.pH Channel and press f3 (select).Mode and Settings and press f3 (select).Buffer Group and press f3 (select).5USA or DINand press f3 (select).Read Type Selection1.pH Channel and press f3 (select).Mode and Settings and press f3 (select).Read Type and press f3 (select).5Auto , Continuous or Timed and press f3 (select).a. If Timed is selected and the time interval needs to be changed – highlight Timed highlight hours (HH), minutes (MM) or seconds (SS); press f3 (edit) screen; use the numeric entry screen to change the values and press f1 (back) when the time©2011 Thermo Fisher Scientific Inc. All rights reserved. All trademarks are the property of Thermo Fisher Scientific Inc. & its subsidiaries.Water Analysis Instruments www.thermoscientifi/water 68X576501 RevA 0811Viewing the Calibration Log1.Calibration Log and press f2 (accept).pH – Channel and press f2 (select).pH , RmV , ORP or Temperature as the calibration type and 5. The meter will display a list of calibrations for the selected channel and calibration type. The list shows the sequential number of the calibration and the date and time it was saved (07/01/2011 12:45).6.f2 (select). Pressf2 (print)f3 (info) to view the electrode slope between calibration points or press f1 (back) to return to the list of calibrations.Viewing the Data Log1. Data Log and press (accept).3. The meter will display a list of the data points. The list shows the sequential number of the data point and the date and time the data point was saved (07/01/2011 12:45).4.point and press f2 (select). Press f2 (print) data points.5。
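The calibration summary described earlier reports an average electrode slope. As a sketch — this is the standard Nernst relation, not a formula quoted from this manual — the theoretical slope at a given temperature, and a measured slope expressed as a percentage of it, are:

```python
import math

R_GAS = 8.314462618    # J/(mol*K)
FARADAY = 96485.33212  # C/mol

def nernst_slope_mv(temp_c: float) -> float:
    """Theoretical (Nernstian) pH electrode slope in mV per pH unit."""
    return R_GAS * (temp_c + 273.15) * math.log(10.0) / FARADAY * 1000.0

def slope_percent(measured_mv_per_ph: float, temp_c: float = 25.0) -> float:
    """Measured slope as a percentage of the theoretical value; a healthy
    glass electrode typically reads in roughly the 90-105 % range."""
    return 100.0 * measured_mv_per_ph / nernst_slope_mv(temp_c)

print(round(nernst_slope_mv(25.0), 2))  # -> 59.16 (mV/pH at 25 degC)
# Hypothetical electrode measuring 57.0 mV/pH:
print(round(slope_percent(57.0), 1))
```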

Drive-Test and Traffic-Statistics Differences in SINR and CQI Caused by Different LTE Base Station Synchronization Modes


1 Problem Description
In an FDD-LTE network in a certain country, the customer paid close attention to drive-test CQI and to CQI traffic-statistics KPIs, and required Huawei to meet the corresponding targets.

A comparison across vendors' networks showed that the LTE networks built on competitor N and Z equipment outperformed our network in both SINR and the ratio of good-or-better CQI.

Given the competitive pressure, we needed to identify the cause of the gap and close it quickly.

2 Cause Analysis
Testing showed that drive-test RSRP was comparable between the LTE network in city A (our equipment), city B (vendor N equipment), and city C (vendor Z equipment), yet SINR differed by about 2 dB.

Many factors affect SINR in an LTE network.

Examples include PCI mod-3 interference, network load, pilot pollution, and external interference.

RF optimization and adjustment ruled out PCI mod-3 interference and pilot pollution.

Spectrum-clearance testing ruled out external interference.

Moreover, the networks in cities A, B, and C were all lightly loaded and had not yet been commercially launched at scale, so network load was not the main cause.

This raised the question of whether some other factor could produce such large SINR and CQI differences at comparable RSRP.

With the customer's cooperation, we obtained some configuration information for the N and Z equipment.

It turned out that the N and Z base stations used frequency synchronization, whereas Huawei used the stricter time synchronization in consideration of CL-optimized handover and eMBMS.

This was a plausible cause, but whether it was a major contributor to the SINR and CQI gap could not yet be confirmed.

To quantify the impact of the time- versus frequency-synchronization modes on network performance, the author selected an area of city A (where Huawei equipment is deployed) for a comparative test.

3 Theoretical Study of Base Station Synchronization Modes
3.1 Time Synchronization
A time signal is a clock signal that carries time-of-day information: year, month, day, hour, minute, and second.

The industry currently uses UTC (Coordinated Universal Time) for this time information.

UTC is the world time standard.

Time synchronization, also called time-of-day synchronization, refers to synchronization of absolute time.

In general, network-wide time synchronization means that the time information of every device in the network is synchronized to UTC; that is, the start instant of each time signal is aligned with UTC.

As shown in the figure, signals A and B are time-synchronized, while signals C and D are not time-synchronized with A.

Time synchronization is also referred to as phase synchronization.

Phase time is the delay between a clock signal and the ideal signal at the corresponding significant instants (typically the rising or falling edges), referred to as the "phase" for short.
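The distinction drawn here — time (phase) synchronization versus mere frequency synchronization — can be illustrated with a small sketch that compares edge timestamps of a clock signal against an ideal reference (all values are illustrative):

```python
def classify_sync(reference_edges, signal_edges, tol=1e-9):
    """Classify a clock against a reference from matching edge timestamps (s).

    Time (phase) synchronized: edges line up with the reference (offset ~ 0).
    Frequency synchronized only: a constant, nonzero offset between edges.
    Neither: the offset drifts, i.e. the two frequencies differ.
    """
    offsets = [s - r for r, s in zip(reference_edges, signal_edges)]
    drift = max(offsets) - min(offsets)
    if drift > tol:
        return "not frequency synchronized (offset drifts)"
    if abs(offsets[0]) > tol:
        return "frequency synchronized only (constant phase offset)"
    return "time (phase) synchronized"

ref = [0.0, 1.0, 2.0, 3.0]                       # ideal 1 PPS edges
print(classify_sync(ref, [0.0, 1.0, 2.0, 3.0]))  # edges aligned with UTC
print(classify_sync(ref, [0.3, 1.3, 2.3, 3.3]))  # same rate, fixed 0.3 s offset
print(classify_sync(ref, [0.0, 1.1, 2.2, 3.3]))  # offset grows each second
```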

LAQUAtwin Series Water Quality Meter Instruction Manual (Operation)


CODE: GZ0000472457Instruction Manual (Operation)COMPACT WATER QUALITY METER LAQUAtwin-EC-11, LAQUAtwin-EC-22, LAQUAtwin-EC-33SpecificationsItems in packageConsumable parts sold separatelyPart Namesotherwise specified.Initial SetupAttaching/detaching the sensorInserting/removing batteriesElectrode conditioningNote●Before using the sensor for the first time or after several days of disuse, perform electrode conditioning.●Perform calibration after electrode condition-ing.1.Place some drops of the conditioning solu-tion into the measurement cell.2.Wait 10 min before use.There is no need to switch the meter ON.3.Clean the flat sensor with running water.Model LAQUAtwin-EC-112233Minimum sample volume 0.12 mL √√√Range and resolution (valid digits)Conductivity0 to 199 μS/cm:200 to 1999 μS/cm:2.00 to 19.99 mS/cm: 1 μS/cm 1 μS/cm 0.01 mS/cm √√√20.0 to 199.9 mS/cm:0.1 mS/cm √√TDS 0.0 to 99.9 ppm:100 to 999 ppm:1000 to 9990 ppm:0.1 ppm1 ppm 10 ppm√Calibration Up to 2 points√Up to 3 points√√Accuracy *1±2% full scale (for each range)√±2% full scale (0 to 19.99 mS/cm)±5% full scale (20.0 to 199.9 mS/cm)√√Temperature display 0 to 50.0︒C √√Target Electrical conductivity Measurement principle 2 electrode bipolar AC Titanium coated with Platinum black Display Custom (monochrome) digital LCD withbacklightOperating environment 5 to 40︒C, 85% or less relative humidity (no condensation)Power CR2032 batteries (⨯2)Battery life Approx. 400 h continuous operation*2Outer dimen-sions/mass 164 ⨯ 29 ⨯ 20 mm (excluding projections), Approx. 45 g (excluding batteries)Main functionTemperature compensation (2%/︒C fixed), waterproof *3, auto stable/auto hold, auto-matic power OFF*1Accuracy is the closeness of agreement betweenthe measured value and actual value of the stan-dard solution in the measurement of the same standard solution as the one used for the calibra-tion. Temperature during the calibration and mea-surement is the same. 
The error of standard solutions and rounding error (±1 digit) are not included.*2The life period if the meter is used in the backlightoff mode. If the backlight is used, battery life will shorten.*3IP67: no failure when immersed in water at adepth or 1 meter for 30 minutes.Please note that the meter can not be used underwater ItemsQuantitySensor S0701Meter1Storage case 1BatteriesCR20322Standard solution1413 μS/cm 112.88 mS/cm1Pipette1Conditioning solution1Instruction manual (Operation)1Instruction manual (Before use)1Items Specifications Part No.Sensor S070, COND3200459672Standard solution514-22, 1413 μS/cm 3999960110514-23, 12.88 mS/cm 3999960111Conditioning solution514-203999960114Attaching the sensor 1.Power OFF the meter.2.Confirm that the waterproofing gasket is clean and undamaged.Detaching the sensor 1.Power OFF the meter.2.Lift the sensor tongue tip and slide the sensor a little away from the meter.3.Pull out the sensor all the way from the meter. Inserting the batteries 1.Power OFF the meter.2.Slide both batteries into the battery case as shown.Be sure to use two CR2032 batteries, and put them with the plus sides (+) upwards.Removing the batteries 1.Power OFF the meter.e a ball-point pen or other tool to pry the batter-ies out from the clips as shown.Basic OperationCalibrationCalibration is required before measurement.Use standard solution within the measurement range in the specifications.Tip●Calibration values are saved even if the meter is switched OFF.●Calibration value is rewritten if calibration is repeated using the same standard solution.MeasurementNote●If a measured value is out of the specified measurement range, "Or" is displayed for upper range and "Ur" is displayed for under range.●Ambient air may cause the measured values to fluctuate. 
To reduce environmental interfer-ence, close the protection cover.●When you have a problem with the calibration or measurement, refer to frequently asked questions.Measurement display changeMeasurement display change is available on LAQUAtwin-EC-22 and LAQUAtwin-EC-33.The display mode switches as follows by press-ing the MEAS switch in the AS mode.●LAQUAtwin-EC-22:Between conductivity and temperature alter-nately●LAQUAtwin-pH-33:Among conductivity, TDS, and temperaturePower ON1.Press and hold the ON/OFF switch.The power is switched ON,and the meter model num-ber is displayed on the LCD. Power OFF1.Press and hold the ON/OFF switch.The power is switched OFF.Precaution on sample settingPlace an appropriate amount of a sample or stan-dard solution into the measurement cell without trapping bubbles inside. If not, the measurement may be inaccurate.Calibration pointsThe number of calibration points is dependent on the meter model.●LAQUAtwin-EC-11:Up to two-point calibration at 1413 μS/cm and 12.88 mS/cm●LAQUAtwin-EC-22 and LAQUAtwin-EC-33:Up to three-point calibration at 1413 μS/cm,12.88 mS/cm, and 111.8 mS/cmMulti-point calibration1.Open the protection cover and place some drops of the standard solution into the measurement cell.Rinsing the sensor with the standard solution beforehand will provide a more accurate cali-bration as it will reduce sample crossover con-tamination.2.Close the protection cover and press theplayed.The calibration value at 25 C is displayed for 1s and the display returns to the measurement mode automatically.3.Open the protection cover and remove the standard solution. Then remove moisture on the sensor by gently dabbing with a soft tissue.This completes the 1st point calibration.4.To perform 2nd point calibration, repeat steps 1. to 3. 
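The specifications list fixed 2 %/°C temperature compensation. A sketch of the standard linear compensation formula this implies — the formula is the common convention for linear EC compensation, not quoted from this manual:

```python
def conductivity_at_25c(raw_us_cm: float, temp_c: float, alpha: float = 0.02) -> float:
    """Normalize a raw conductivity reading to the 25 degC reference temperature.

    alpha = 0.02 corresponds to the meter's fixed 2 %/degC compensation
    coefficient in the standard linear-compensation convention.
    """
    return raw_us_cm / (1.0 + alpha * (temp_c - 25.0))

# A sample reading 1500 uS/cm at 28 degC is reported as about 1415 uS/cm:
print(round(conductivity_at_25c(1500.0, 28.0)))  # -> 1415
```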
Calibration error If blinks and Er4 (error dis-play) appears, the calibration has failed.Perform electrode conditioning.Check that the correct standard solution is used,and repeat calibration after cleaning the sensor.If the calibration repeatedly fails when using the correct standard solution(s), the sensor may have deteriorated. Replace the sensor with newone.Sample setting1.Open the protection cover and put some drops of sample into the measurement cell.2.Close the protection cover. Measurement modeThe auto stable (AS) mode and the auto hold (AH) mode can be selected. Refer to " Mea-surement mode change" (page 4) for the opera-tion to set the measurement mode. Auto stable (AS) mode1.Confirm that the meter is in the measure-ment mode, and place a sample on the sen-sor.locked.2.appears.Auto hold (AH) modeand will not change until the MEAS switch is pressed for the next measurement.1.Confirm that the meter is in the measure-ment mode, and place a sample on the sen-sor.2.Press the MEAS switch.The auto hold function is acti-vated.blinks until the mea-sured value has stabilized.When the measured value is stable, stops blinking and the displayed value is locked with3.Document the displayed value.4.Press the MEAS switch.disappears.Be sure to perform this step before starting the next measurement. Or, you may mistake the displayed hold value for the next measured value.MaintenanceAppendixFrequently asked questions Storage1.Clean the sensor with tap water.2.Dab gently with soft tissue or cloth to remove moisture on the sensor and meter.Especially be sure to treat the sensor gently to prevent damaging it.3.Close the protection cover before storing the meter. Temperature sensor adjustmentTemperature sensor adjustment is available on LAQUAtwin-EC-22 and LAQUAtwin-EC-33.To perform accurate measurement with correc-tion for temperature effects, follow the steps below. 
Normally this is not necessary.1.Ready a reference thermometer, and allow the meter and reference thermometer to reach to room temperature.2.Set the display mode to temperature refer-ring to " Measurement display change"(page 2).3.Press the CAL switch.The meter displays the setting screen for tar-get temperature.4.Press the MEAS switch to adjust the dis-played temperature on the meter to match the temperature indicated by the reference thermometer.Pressing the MEAS switch increases the dis-played temperature. After the displayed tem-perature reaches 40°C, it returns to 5°C.5.Press the CAL switch again to apply the displayed value to the adjustment.The adjustment starts. The adjusted value blinks with and displayed.After the adjustment is complete, the adjusted value stops blinking with MEAS and dis-played.If Er4 (error display) appears, the adjustment has failed. Retry the above steps increasing the time spent on the step 1.If the adjustment repeatedly fails, the sensor may have deteriorated. Replace the sensor with new one.Initializing calibration dataInitialize calibration in the following cases.●To delete the calibration data●If the number of points for the last calibration is uncertain.●After the sensor is replaced.1.Press and hold the CAL and ON/OFF switches for over 3 seconds when the meter is switched OFF to Initialize calibra-tion.After a moment of all segment indication, the software version is displayed. And then, the display changes as shown right.2.Press the CAL switch.All calibration data is reset.When the initialization of cali-bration data is complete, End appears.The meter automatically switches OFF. Initializing the settingsAll setup choices are erased. The meter is reset to the factory default values.1.Press and hold the MEAS, CAL and ON/OFF switches for over 3 seconds when the meter is switched OFF to enter the initial-ization.After a moment of all segment indication, the software version is displayed. 
And then, the display changes as shown right.2.Press the CAL switch.All calibration data is reset.When the initialization of set-tings is complete, End appears.The meter automatically switches OFF.Er4 is dis-played during the calibra-tionPlease note that if you press the CAL switch in mV or temperature display mode, Er4 is displayed. This is because there is no calibration facility available for these modes.Er1 is dis-played soon power ON.The internal IC in the meter may be defective. Perform meter initializa-tion.If Er1 is still displayed after the initial-ization, the internal IC in the meter is defective. Replace the meter with a new one (the meter cannot be repaired).Er2 is dis-played right after power ON.The internal IC in the meter is defec-tive. Replace the meter with a new one (the meter cannot be repaired).Er3 is dis-played right after power ON.The internal IC in the meter is defec-tive. Replace the meter with a new one (the meter cannot be repaired).Question AnswerSetup ModeThe setup mode allows the user to customize the meter to his specific needs.To enter the setup mode, press and hold the MEAS and ON/OFF switches for over 3 seconds when the meter is switched OFF. All the LCD segments appear and then the meter enters the setup mode.Tip●To have the changes apply, you need to go through the entire steps from “Setup mode entry” to “Setup completion” shown below. 
To leave a setting as it is, just press CAL switch in the setting.●To exit the setup mode with no change of settings, press the ON/OFF switch earlier than pressing CAL switch in the last step but one, or the “Backlight setting” step.Setup mode entryUnit settingThe display units can be changed.TDS method setting (Only LAQUAtwin-EC-33)The TDS method can be selected from the following options only on LAQUAtwin-EC-33.●FACt: KCl with factor adjustable from 0.4 to 1.0 (default 0.5)●442: Myron L 442 non-linear standard curve●En: European environmental standard non-linear curve ●NACL: Non linear salinity curveThis step is bypassed on LAQUAtwin-EC-11 and LAQUAtwin-EC-22.Factor setting (Only LAQUAtwin-EC-33 with the TDS method set to FACt)This step is bypassed on LAQUAtwin-EC-11 and LAQUAtwin-EC-22, and when the TDS method is set to 442, En, or NACL on LAQUAtwin-EC-33.*The setting range is from F0.4 to F1.0.In this setting, pressing the MEAS switchincreases the displayed value. After the dis-played value reaches F1.0, it returns toF0.4. Measurement mode changeThe measurement mode can be switched.NoteThe AH (auto hold) mode is applied only to conductivity measurement.* Measurement display change is available in the AS mode. Refer to " Measurement display change" (page 2).Backlight settingThe backlight can be switched to ON or OFF.Setup completion31, Miyanonishi-cho, Kisshoin Minami-ku, Kyoto,。
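For the FACt TDS method in the setup menu, TDS is conventionally estimated as conductivity multiplied by the user-set factor. A sketch under that assumption:

```python
def tds_ppm(conductivity_us_cm: float, factor: float = 0.5) -> float:
    """TDS estimate for the FACt method: TDS = factor x conductivity.

    The factor is user-adjustable from 0.4 to 1.0 (default 0.5), matching the
    setup menu; 0.5 is typical for KCl-like solutions.
    """
    if not 0.4 <= factor <= 1.0:
        raise ValueError("factor must be between 0.4 and 1.0")
    return factor * conductivity_us_cm

# The 1413 uS/cm standard solution corresponds to roughly 706.5 ppm TDS
# at the default factor:
print(tds_ppm(1413.0))  # -> 706.5
```

The 442, En, and NACL methods listed in the menu instead apply non-linear curves, which this sketch does not model.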

Measurement Studio Measurement Computing Edition Release Notes


RELEASE NOTESMeasurement StudioMeasurement Computing Edition These release notes supplement the Measurement Studio MeasurementComputing Edition User Manual . Refer to this document for informationabout new features and functionality, specific updates to thedocumentation, and resources in Measurement Studio. These releasenotes include information about Measurement Studio support for VisualStudio 2008, Visual Studio 2005, and Visual Studio .NET 2003.For installation instructions, installation requirements, deploymentinformation, a list of fixed bugs, and known issues, refer to theMeasurement Studio Readme . There is a different Measurement StudioReadme for each supported version of Visual Studio. The MeasurementStudio Readme files are available in the root folder of the installation CDand are linked from the Autorun menu. After installing MeasurementStudio 8.6, select Start»All Programs»National Instruments»<Measurement Studio>»Readme to access the Measurement Studio 8.6Readme .Note There are separate Start menu items for each version of Visual Studio support.For a complete introduction to Measurement Studio and to learn aboutMeasurement Studio concepts, controls, and features, refer to theMeasurement Studio Measurement Computing Edition User Manual .Select Start»All Programs»National Instruments»<MeasurementStudio>»Measurement Studio User Manual to access the MeasurementStudio Measurement Computing Edition User Manual .TipThe Documentation Updates section of these Release Notes details changes to theMeasurement Studio Computing Edition User Manual content.™™What’s New in Measurement Studio 8.6What’s New in Measurement Studio 8.6 Support for Visual Studio 2008New features in Measurement Studio 8.6 support for Visual Studio 2008include the following. 
Refer to the New Features in Measurement Studio 8.6 section for more information.
• Technical Data Management Streaming (TDMS) .NET Support
• Mouse Cursor Customizability
• Additional new features
  – Instrument Driver Wizard HTML Tag Removal
  – New Properties for AutoSpacing in Graph Axes
  – Bug Fixes

What's New in Measurement Studio 8.6 Support for Visual Studio 2005

New features in Measurement Studio 8.6 support for Visual Studio 2005 include the following. Refer to the New Features in Measurement Studio 8.6 section for more information.
• Technical Data Management Streaming (TDMS) .NET Support
• Mouse Cursor Customizability
• Additional new features
  – Instrument Driver Wizard HTML Tag Removal
  – New Properties for AutoSpacing in Graph Axes
  – Bug Fixes

What's New in Measurement Studio 8.6 Support for Visual Studio .NET 2003

Measurement Studio support for Visual Studio .NET 2003 is considered a legacy product. The version number for Measurement Studio support for Visual Studio .NET 2003 is 8.1.2.

New Features in Measurement Studio 8.6

Technical Data Management Streaming (TDMS) .NET Support

Technical Data Management Streaming (TDMS) is a file format based on the National Instruments TDM data model used to stream data to disk. You can use the TDMS .NET class library to describe, store, and read measurement data that is optimized for high-speed data streaming and post processing. Additionally, you can use the TDMS .NET class library to create files you can use in LabVIEW, LabWindows/CVI, DIAdem, and third-party industry tools, and files created by these applications can be used by the TDMS .NET class library.

For more information, refer to Key Measurement Studio TDMS .NET Library Features in the NI Measurement Studio Help.

TDM Excel Add-In

You can use the TDM Excel Add-In to load NI .tdm and .tdms files into Microsoft Excel. 
Use the toolbar from within Excel to choose which properties are loaded into Excel at the file, group, and channel levels, including custom properties.

Refer to NI Developer Zone for more information about the TDM Excel Add-In.

Mouse Cursor Customizability

Different cursor images represent different interactive operations that an end user can perform on a control. For example, when editing or selecting text, you typically display a System.Windows.Input.Cursors.IBeam cursor, and for zooming on a graph, you typically display a magnifying lens. Now you can customize mouse cursors, at design time or programmatically, for different interactions with the Measurement Studio Windows Forms and Web Forms controls.

Additional New Features

Measurement Studio 8.6 also introduces the following new features:
• Instrument Driver Wizard HTML Tag Removal—When the Instrument Driver Wizard creates .NET entry points, the Instrument Driver Wizard extracts the specific documentation for each member from the function panel (.fp) file. This documentation can contain HTML entities that make it difficult to read the documentation. With this update, the Instrument Driver Wizard includes an option to remove the HTML tags from the documentation, making the documentation easier to read.
• New Properties for AutoSpacing in Graph Axes—You can use the AutoSpacingMajorInterval and AutoSpacingMinorInterval properties to return the value of the interval used when plotting with AutoSpacing.
• Bug Fixes—Measurement Studio 8.6 includes many fixes for previously reported bugs. Refer to the fixed bug chart in the Measurement Studio 8.6 Readme for more information. Select Start»All Programs»National Instruments»<Measurement Studio>»Readme to access the Measurement Studio 8.6 Readme.

Documentation Updates

The following sections describe changes to the printed Measurement Studio documentation for Measurement Studio 8.5. 
These changes will be incorporated into future revisions of the Measurement Studio Measurement Computing Edition User Manual.

Chapter 1 Updates

Page 1-3: Following Step 4 of the Installation Instructions, insert the following note:
Note: The Measurement Studio installation process includes a command window that opens and closes on your desktop. Closing this window prematurely causes Measurement Studio integration features to fail to be configured properly. For example, Measurement Studio controls do not appear in the Toolbox, or the Measurement Studio New Project Wizards are not available.

Chapter 2 Updates

Page 2-1: Insert a new bullet before the User Interface bullet:
• Technical Data Management Streaming (TDMS)

Page 2-9: Insert the following before the User Interface heading:
Technical Data Management Streaming (TDMS) .NET Support
Technical Data Management Streaming (TDMS) is a file format based on the National Instruments TDM data model used to stream data to disk. You can use the TDMS .NET class library to describe, store, and read measurement data that is optimized for high-speed data streaming and post processing. Additionally, you can use the TDMS .NET class library to create files that you can use in LabVIEW, CVI, and DIAdem, and files created by these applications can be used by the TDMS .NET class library. 
You can use the Measurement Studio TDMS .NET class library to perform the following operations:
• Read and write array data in a structured format from and to a .tdms file.
• Read and write analog waveform data or digital waveform data, including timing information, from and to a .tdms file.
• Using the TdmsProperty class, create custom properties for each level of the hierarchy by defining a name, data type, and value.
For more information, refer to the Key Measurement Studio TDMS .NET Library Features topic in the NI Measurement Studio Help.

TDM Excel Add-In
You can use the TDM Excel Add-In to load NI .tdm and .tdms files into Microsoft Excel. Use the toolbar from within Excel to choose which properties are loaded into Excel at the file, group, and channel levels, including custom properties.
To uninstall the TDM Excel Add-In, select Start»Control Panel»Add or Remove Programs, select National Instruments Software from the list, and click the Change/Remove button. Then select NI TDM Excel Add-in from the list, and click the Remove button.
Refer to NI Developer Zone for more information about the TDM Excel Add-In.

Learning Measurement Studio

As you work with Measurement Studio, you might need to consult additional resources. For detailed Measurement Studio help, including function reference and in-depth documentation on developing with Measurement Studio, refer to the NI Measurement Studio Help within the Visual Studio environment. The NI Measurement Studio Help is fully integrated with the Visual Studio help. 
You must have Visual Studio installed to view the online help, and you must have the Microsoft .NET Framework SDK 1.1 for Visual Studio .NET 2003, the Microsoft .NET Framework SDK 2.0 for Visual Studio 2005, or the Microsoft .NET Framework SDK 3.5 for Visual Studio 2008 installed in order for links from Measurement Studio help topics to .NET Framework help topics to work. You can launch the NI Measurement Studio Help in the following ways:
• From the Windows Start menu, select Start»All Programs»National Instruments»<Measurement Studio>»Measurement Studio Documentation. The help launches in a stand-alone help viewer.
• From Visual Studio, select Help»Contents to view the Visual Studio table of contents. The NI Measurement Studio Help is listed in the table of contents.
• From Visual Studio, select Measurement Studio»NI Measurement Studio Help. The help launches within the application.

Chapter 2 Updates (continued):
Pages 2-13, 2-17, 2-29, 2-34: Insert the following bullet at the end of the Cursor Operations section:
• Create custom mouse cursors programmatically or at design time using the mouse cursor style editor.
Pages 2-15, 2-32: Insert the following bullet at the end of the Additional Operations section:
• Create custom mouse cursors programmatically or at design time using the mouse cursor style editor.

The following resources also are available to provide you with information about Measurement Studio.
• Getting Started information—Refer to the Measurement Studio Core Overview topic and the Getting Started with the Measurement Studio Class Libraries section in the NI Measurement Studio Help for an introduction to Measurement Studio and for walkthroughs that guide you step-by-step in learning how to develop Measurement Studio applications. 
For an introduction to Measurement Studio resources, refer to the Using the Measurement Studio Help topic in the NI Measurement Studio Help.
• Examples—Measurement Studio installs examples organized by class library, depending on the component, the version of Visual Studio or the .NET Framework that the example supports, the version of Measurement Studio installed on the system, and the operating system. For more information on example locations, refer to the Where To Find Examples topic in the NI Measurement Studio Help.
• Measurement Computing Technical Support—Refer to Appendix A, Contacting Measurement Computing Corp., in the Measurement Studio Measurement Computing Edition User Manual for more information. You can find the User Manual at Start»All Programs»National Instruments»<Measurement Studio>»Measurement Studio Documentation»User Manual.
• Measurement Studio Measurement Computing Edition Web site, /mstudio—Contains Measurement Studio news, support, and downloads.
• NI Developer Zone—Provides access to online example programs, tutorials, technical news, and Measurement Studio discussion forums where you can participate in discussion forums for .NET Languages.
• Review the information from the Microsoft Web site on using Visual Studio.

National Instruments, NI, and LabVIEW are trademarks of National Instruments Corporation. Refer to the Terms of Use section on /legal for more information about National Instruments trademarks. Other product and company names mentioned herein are trademarks or trade names of their respective companies. For patents covering National Instruments products/technology, refer to the appropriate location: Help»Patents in your software, the patents.txt file on your media, or the National Instruments Patent Notice at /patents.

© 2006–2008 National Instruments Corporation. All rights reserved.
372126E-01  Nov 08
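The TDMS model described in these release notes is a fixed three-level hierarchy (file, channel group, channel) where every level can carry named properties, as the TdmsProperty name/type/value description above indicates. A language-neutral sketch of that hierarchy, with plain Python standing in for the .NET classes (all class and field names here are illustrative, not the Measurement Studio API):

```python
from dataclasses import dataclass, field

# Illustrative in-memory model of the TDMS hierarchy: a file contains
# groups, a group contains channels, and every level carries properties.

@dataclass
class TdmsChannel:
    name: str
    data: list = field(default_factory=list)        # raw array data
    properties: dict = field(default_factory=dict)  # e.g. {"unit_string": "V"}

@dataclass
class TdmsGroup:
    name: str
    channels: dict = field(default_factory=dict)
    properties: dict = field(default_factory=dict)

@dataclass
class TdmsFileModel:
    path: str
    groups: dict = field(default_factory=dict)
    properties: dict = field(default_factory=dict)  # file-level custom properties

# Build the hierarchy and attach a custom property at each level.
f = TdmsFileModel("measurement.tdms", properties={"author": "lab"})
g = TdmsGroup("Noise Tests", properties={"fixture": "A3"})
ch = TdmsChannel("Vout", data=[0.01, 0.02, 0.015], properties={"unit_string": "V"})
g.channels[ch.name] = ch
f.groups[g.name] = g

print(f.groups["Noise Tests"].channels["Vout"].data)  # [0.01, 0.02, 0.015]
```

This is only a mental model of the structure; the actual on-disk TDMS format and the .NET API are documented in the NI Measurement Studio Help referenced above.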

Keysight N9340B Handheld Spectrum Analyzer User Manual


Keysight N9340B Handheld Spectrum Analyzer. Please note: Agilent's electronic measurement instrument business has become Keysight Technologies, Inc.

For detailed information about this change, please visit:

User Manual

Notices
© Keysight Technologies, Inc. 2008-2014
No part of this manual may be reproduced in any form or by any means (including electronic storage and retrieval or translation into a foreign language) without prior agreement and written consent from Keysight Technologies, Inc. as governed by United States and international copyright laws.

Trademark Acknowledgments

Manual Part Number
N9340-90007

Edition
Edition 2, July 2014
Printed in China

Published by:
Keysight Technologies
No 116 Tianfu 4th Street, Chengdu, 610041, China

Warranty
THE MATERIAL CONTAINED IN THIS DOCUMENT IS PROVIDED "AS IS," AND IS SUBJECT TO BEING CHANGED, WITHOUT NOTICE, IN FUTURE EDITIONS. FURTHER, TO THE MAXIMUM EXTENT PERMITTED BY APPLICABLE LAW, KEYSIGHT DISCLAIMS ALL WARRANTIES, EITHER EXPRESS OR IMPLIED WITH REGARD TO THIS MANUAL AND ANY INFORMATION CONTAINED HEREIN, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. KEYSIGHT SHALL NOT BE LIABLE FOR ERRORS OR FOR INCIDENTAL OR CONSEQUENTIAL DAMAGES IN CONNECTION WITH THE FURNISHING, USE, OR PERFORMANCE OF THIS DOCUMENT OR ANY INFORMATION CONTAINED HEREIN. SHOULD KEYSIGHT AND THE USER HAVE A SEPARATE WRITTEN AGREEMENT WITH WARRANTY TERMS COVERING THE MATERIAL IN THIS DOCUMENT THAT CONFLICT WITH THESE TERMS, THE WARRANTY TERMS IN THE SEPARATE AGREEMENT WILL CONTROL.

Technology Licenses
The hardware and/or software described in this document are furnished under a license and may be used or copied only in accordance with the terms of such license.

U.S. Government Rights
The Software is "commercial computer software," as defined by Federal Acquisition Regulation ("FAR") 2.101. Pursuant to FAR 12.212 and 27.405-3 and Department of Defense FAR Supplement ("DFARS") 227.7202, the U.S. government acquires commercial computer software under the same terms by which the software is customarily provided to the public. Accordingly, Keysight provides the Software to U.S. 
government customers under its standard commercial license, which is embodied in its End User License Agreement (EULA), a copy of which can be found at /find/sweula. The license set forth in the EULA represents the exclusive authority by which the U.S. government may use, modify, distribute, or disclose the Software. The EULA and the license set forth therein does not require or permit, among other things, that Keysight: (1) Furnish technical information related to commercial computer software or commercial computer software documentation that is not customarily provided to the public; or (2) Relinquish to, or otherwise provide, the government rights in excess of these rights customarily provided to the public to use, modify, reproduce, release, perform, display, or disclose commercial computer software or commercial computer software documentation. No additional government requirements beyond those set forth in the EULA shall apply, except to the extent that those terms, rights, or licenses are explicitly required from all providers of commercial computer software pursuant to the FAR and the DFARS and are set forth specifically in writing elsewhere in the EULA. Keysight shall be under no obligation to update, revise or otherwise modify the Software. With respect to any technical data as defined by FAR 2.101, pursuant to FAR 12.211 and 27.404.2 and DFARS 227.7102, the U.S. government acquires no greater than Limited Rights as defined in FAR 27.401 or DFAR 227.7103-5(c), as applicable in any technical data.

Safety Notices
A CAUTION notice denotes a hazard. It calls attention to an operating procedure, practice, or the like that, if not correctly performed or adhered to, could result in damage to the product or loss of important data. Do not proceed beyond a CAUTION notice until the indicated conditions are fully understood and met.
A WARNING notice denotes a hazard. It calls attention to an operating procedure, practice, or the like that, if not correctly performed or adhered to, could result in personal injury or death. 
Do not proceed beyond a WARNING notice until the indicated conditions are fully understood and met.

Contents

1  Introducing the N9340B
   Introduction  2
   Front panel overview  4
   Top panel overview  5
   Display annotations  6
2  Getting Started
   Checking the shipment package and packing list  10
   Power requirements  11
   AC power cord specifications  12
   Safety notes  13
   Installing the battery  16
   Usage tips  18
   Preparation  19
   Making a basic measurement  20
   Contacting Keysight  24
3  System Settings
   Adjusting display effects and sound  26
   System settings  27
   File  29
   System information  32
   Error messages  33
   Timebase calibration  34
   Factory default settings  35
4  Making Measurements
   Measuring multiple signals  38
   Measuring low-level signals  43
   Improving frequency resolution and accuracy  48
   Measuring signal distortion  49
   Third-order intermodulation distortion  52
   One-button measurements  54
   Stimulus response transmission measurements  57
   Measuring a low-pass filter's stop-band attenuation  59
   Reflection calibration measurements  61
   Measuring return loss using reflection calibration  63
5  Key Reference
   Amplitude  66
   BW/Span  68
   Enter  73
   ESC/Clear  74
   Frequency  75
   Marker  76
   Measure  81
   Span  82
   Trace  83
   Limit  87
6  SCPI Command Reference
   SCPI language basics  90
   Common commands  93
   CALCulate subsystem  94
   DEMOdulation subsystem  101
   DISPlay subsystem  102
   INITiate subsystem  105
   INSTrument subsystem  106
   MEASure subsystem  107
   SENSe subsystem  110
   SYSTem subsystem  119
   TRACe subsystem  120
   TGENerator subsystem  121
   TRIGger subsystem  125
   UNIT subsystem  126
7  Error Messages
   Error message table  128
8  Key Structure Diagrams
   AMPTD  132
   BW/SWP  133
   FREQ  134
   MARKER  135
   MEAS  136
   SPAN  137
   SYS  138
   TRACE  139
   LIMIT  140

Conventions used in this manual
1. Braces { } denote softkeys; for example, {Start} denotes the Start softkey.
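The SCPI subsystems listed in chapter 6 are driven by plain newline-terminated text commands sent to the analyzer. A minimal sketch of that from a controlling PC follows; the host address, port 5025 (the conventional raw-socket SCPI port), and the :FREQuency:CENTer command are assumptions for illustration rather than values taken from this manual, and only *IDN? is a guaranteed IEEE 488.2 common command:

```python
import socket

def scpi_frame(cmd: str) -> bytes:
    """Frame one SCPI command: trim stray whitespace, append the
    newline terminator, and encode as ASCII."""
    return (cmd.strip() + "\n").encode("ascii")

class ScpiClient:
    """Minimal raw-socket SCPI client (newline-terminated text protocol)."""

    def __init__(self, host: str, port: int = 5025, timeout: float = 5.0):
        # Port 5025 is the common raw-socket SCPI port; confirm against
        # your instrument's documentation.
        self.sock = socket.create_connection((host, port), timeout=timeout)

    def write(self, cmd: str) -> None:
        self.sock.sendall(scpi_frame(cmd))

    def query(self, cmd: str) -> str:
        self.write(cmd)
        buf = b""
        while not buf.endswith(b"\n"):
            chunk = self.sock.recv(4096)
            if not chunk:
                break
            buf += chunk
        return buf.decode("ascii").strip()

# Usage sketch (the address and the second command are illustrative):
# sa = ScpiClient("192.168.0.10")
# print(sa.query("*IDN?"))               # IEEE 488.2 common command
# sa.write(":FREQuency:CENTer 100 MHz")  # SENSe-subsystem style command
```

Commands ending in `?` are queries and return a reply; bare commands set state and return nothing, which is why the sketch separates `query` from `write`.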

European Pharmacopoeia, Edition 7.5

EUROPEAN PHARMACOPOEIA 7.5
INDEX
To aid users, the index includes a reference to the supplement in which the latest version of a text can be found. For example: Amikacin sulfate...............................................7.5-4579 means the monograph Amikacin sulfate can be found on page 4579 of Supplement 7.5. Note that where no reference to a supplement is made, the text can be found in the principal volume.
English index ........................................................................ 4707
Latin index ................................................................................. 4739
Numerics 1. General notices ................................................................... 7.5-4453 2.1.1. Droppers...................

Testo 315-3 CO/CO2 Measuring Instrument Instruction Manual


testo 315-3 · CO/CO2 measuring instrument
Instruction manual

1 Contents

1 Contents (3)
2 Safety and the environment (4)
  2.1. About this document (4)
  2.2. Ensure safety (4)
  2.3. Protecting the environment (5)
3 Specifications (6)
  3.1. Use (6)
  3.2. Technical data (6)
    3.2.1. Measurement ranges and resolution (6)
    3.2.2. Accuracy and response time (6)
    3.2.3. Other instrument data (7)
    3.2.4. Bluetooth® module (option) (7)
4 Product description (9)
  4.1. Overview (9)
    4.1.1. Control elements and connections (9)
    4.1.2. Display (12)
  4.2. Basic properties (13)
5 First steps (14)
6 Using the product (15)
  6.1. Performing settings (15)
  6.2. Measuring (16)
  6.3. Printing the measurement results (17)
  6.4. Bluetooth® and IrDA data transfer (18)
7 Maintaining the product (18)
8 Tips and assistance (19)
  8.1. Questions and answers (19)
  8.2. Calibration interval (21)
  8.3. Accessories and spare parts (21)

2 Safety and the environment

2.1. About this document

Use
> Please read this documentation through carefully and familiarize yourself with the product before putting it to use. Pay particular attention to the safety instructions and warning advice in order to prevent injuries and damage to the products.
> Keep this document to hand so that you can refer to it when necessary.
> Hand this documentation on to any subsequent users of the product.

Symbols and writing standards
Representation — Explanation:
(note icon): Note: basic or further information.
1. ... 2. ...: Action: several steps; the sequence must be followed.
> ...: Action: a step or an optional step.
- ...: Result of an action.
Menu: Elements of the instrument, the instrument displays or the program interface.
[OK]: Control keys of the instrument or buttons of the program interface.
✓ ...: Handling prerequisite.

2.2. Ensure safety
> Only operate the product properly, for its intended purpose and within the parameters specified in the technical data. Do not use any force. 
> Dangers may also arise from the systems being measured or the measuring environment: note the safety regulations valid in your area when performing the measurements.
> Do not perform contact measurements on non-insulated, live parts.
> Do not store the product together with solvents. Do not use any desiccants. These substances may impair the operation and reliability of the instrument.
> Carry out only the maintenance and repair work on this instrument that is described in the documentation. Follow the prescribed steps exactly. Use only original spare parts from Testo.
> Temperatures given on probes/sensors relate only to the measuring range of the sensors. Do not expose handles and feed lines to any temperatures in excess of 70 °C unless they are expressly permitted for higher temperatures.

For products with Bluetooth® (optional)
Changes or modifications that have been made without the explicit consent of the responsible approval authority may cause the retraction of the type approval. Data transfer may be disturbed by equipment that uses the same ISM band, e.g. WLAN, microwave ovens, ZigBee. The use of radio communication links is not permitted in aeroplanes and hospitals, among others. For this reason the following point must be ensured before entering: the data transfer function must not be active.

2.3. Protecting the environment
> Dispose of faulty rechargeable batteries/spent batteries in accordance with the valid legal specifications.
> At the end of its useful life, send the product to the separate collection for electric and electronic devices (observe local regulations), or return the product to Testo for disposal.

3 Specifications

3.1. Use
The testo 315-3 is a convenient and robust measuring instrument for recording ambient CO/CO2 values. It is used to avoid risks. 
Fields of application are, for example:
- Heating systems with operation dependent on ambient air
- Ventilation systems
- Storage in the food and drinks industry

The Bluetooth® option may only be operated in countries in which it is type approved.

3.2. Technical data

3.2.1. Measurement ranges and resolution

Measurement parameter | Measuring range | Resolution
CO           | 0 to 100 ppm                   | 0.5 ppm
CO2          | 0 to 10,000 ppm                | 10 ppm
Humidity¹    | 15 to 95 %RH                   | 0.1 %RH
Temperature¹ | -10 to +60 °C / 14 to 140 °F   | 0.1 °C/°F

3.2.2. Accuracy and response time

Measurement parameter | Accuracy | Response time
CO | ±3 ppm up to 20 ppm; ±5 ppm from 20 ppm | < 120 s (t90)

¹ Only when the humidity/temperature sensor is connected (optional)
Bluetooth ® module (option)• Bluetooth ® type: BlueNiceCom IV• Bluetooth ® product note: BNC4_HW2x_SW2xx3 Specifications8• Bluetooth® identification: B013784• Bluetooth® company: 10274• Coverage: < 10 mCertificationBelgium (BE), Bulgaria (BG), Denmark (DK), Germany (DE), Estonia (EE), Finland (FI), France (FR), Greece (GR), Ireland (IE), Italy (IT), Latvia (LV), Lithuania (LT), Luxembourg (LU), Malta (MT), Netherlands (NL), Austria (AT), Poland (PL), Portugal (PT), Romania (RO), Sweden (SE), Slovakia (SK), Slovenia (SI), Spain (ES), Czech Republic (CZ), Hungary (HU), United Kingdom (GB), Republic of Cyprus (CY).Iceland (IS), Liechtenstein (LI), Norway (NO) and Switzerland (CH).Turkey (TR), El Salvador (SV), Columbia (CO)4 Product description94Product description4.1.Overview4.1.1.Control elements and connections1 Connection for temperature/humidity module (optional)2 IR interface, record printer 3CO sensor module4CO 2 sensor module (back) 5 Display 6 Keyboard7 Reset button (back) 8USB mains unit socket4 Product description10Temperature/humidity module option9 Temperature/humidity sensor10 Connection for temperature/humidity module to the testo 315-3 Key functionsButton Duration Status FunctionShortReading display • Holds measuringvalues• Displays maximumvalues• Displays minimumvalues• Displays currentvaluesConfigurationmodeConfirms entry, nextsettingLongReading display(current value)Opens configurationmodeReading display(maximumvalue/minimumvalue/hold value)Resets maximumvalue, minimum value,hold valueConfigurationmodeExits configurationmode (changes arestored)ShortInstrument is off Switches instrumentonInstrument is on Switches on displayillumination/switchesoff display illuminationThe displayilluminationswitches offafter 3 minutesif no keys arepressed.Long Instrument is on Switches instrumentoff4 Product descriptionButtonDuration StatusFunctionShort Reading display,only if humiditymodule isconnected• Displays humidityand temperature• Displays dewpoint/temperature• 
Displays CO/CO2 ConfigurationmodeChangesoption/increases valueLong Reading display Sets zeropoint/deletes zeropoint ConfigurationmodeIncreases valuequicklyShort Reading display Prints data ConfigurationmodeChangesoption/decreasesvalueLong Reading display Switches on datatransfer mode via IrDaor Bluetooth®(optional) / switchesoff data transfer modevia IrDa or Bluetooth®(optional) ConfigurationmodeDecreases valuequickly4 Product description4.1.2. Display1 Status informationIconFunctionBattery capacity: 100% / 66% / 33%, <10%Micro USB mains unit is connected.Alarm off, alarm onFlashes: the alarm threshold is exceededData transferData transfer enabledLights up: information Flashes: error messageDifferential measurement calibration Flashes: adjust CO 2 sensor in fresh air.CO 2 sensor should be adjusted in fresh air after 120 operating hours.2 Measuring functions3 Display of CO measuring value, humidity, dew point, date (dayand month), hours and minutes, measurement parameter 4 Display of CO 2 measuring value, adjustment value, temperature, year, measurement parameter.5 Settings4 Product description4.2.Basic propertiesPower supplyPower is supplied to the testo 315-3 via:• Rech. batt., type: Li-Po rechargeable battery pack, 3.7 V/1,500 mAh (can be replaced via Testo Service)• USB mains unit inc. cable (0554 1105), 5,0 V/1.000 mAhWith an attached mains unit, power is supplied automatically viathe mains unit and the instrument's rechargeable battery ischarged. To avoid destroying the rechargeable battery, only chargeit at an ambient temperature of 0 to 45 °C/32 to 113 °F.5 First steps5First stepsSwitching onConnect the humidity/temperature module (0636 9725)before switching the instrument on (option).1. Switch the instrument on: press [] briefly.- All segments of the display light up for around 1 s.- The warm-up phase then occurs for 30 s. 
(information indisplay: warm-up).During first commissioning, the configuration menu isopened automatically, see Settings.- The current measurement values are displayed. The instrument is operational.The measurement value of the CO sensor is displayedin the upper line.The measurement value of the CO2 sensor is displayedin the lower line.Switching off2. Switch off the instrument: press and hold down [].6 Using the product6Using the product6.1.Performing settingsOpening the configuration menu✓The instrument is in measurement view.> Open configuration menu: press and hold down [] untilconfiguration mode appears.Selecting, opening and setting functions> To select the next function: press [] / [] briefly.> To enable the selected function: press [] briefly.> To adjust the open function: press [] / [] briefly.> To cancel the function: press [] briefly until the displaychanges.Adjustable functionsEnsure correct settings: all settings are transferredimmediately. There is no Cancel function.Settings and display of temperature and humidity are onlypossible if the humidity module was connected beforeswitching on.Function Setting options/commentsalarm off or onon selected:• Enables/disables acoustic signal• Sets CO limiting value• Sets CO2 limiting valueauto-off off or onon selected:Sets valuedate Sets year, month, daytime off or on selectedSelects 24h or am or pm format, sets hoursand minutesunit CO2: Vol% or ppmtemp./dewpoint: °F, °C/°Ftd, °Ctd6 Using the productFunction Setting options/commentscalibration CO2yes (adjust)or no (do not adjust)yes selected:CO2 value is adjusted to the nominal valueand stored (only possible with a current CO2value < 650 ppm)The calibration/adjustment must becarried out in fresh air that has anormal CO2 concentration ofapprox. 
400 ppm.To avoid incorrect calibra-tion/adjustment values, calibrationshould, for example, not be carriedout on busy roads or in closed rooms.Ensure that, before and during thecalibration/adjustment, no exhaled airreaches the instrument.Before completing the adjustment, theinstrument should be left in the freshair for 3-4 minutes.calibrationhumidityyes or noyes selected:Humidity values are aligned to the alignmentpoints 11.3 % RH and 75.3 % RH and saved.To adjust to the alignment points, usethe testo Control and alignment setfor humidity sensors (0554 0660).> Cancel configuration menu: press and hold down [] untilconfiguration mode display switches to the measurementview.6.2.Measuring✓testo 315-3 is on.- Display of the current CO and CO2 measuring values.Switching to humidity, temperature and dew point value isonly possible if a humidity/temperature module isconnected.> Display humidity and temperature value: press [] briefly.6 Using the product> Display dew point: press [] briefly.Delta measurement✓testo 315-3 for CO and CO2 is in the measurement view.> Call up Delta measurement menu: press [] until the display changes.- Delta measurement is carried out ([] lights up). 
The current measuring values are zeroed as a reference.> To cancel the function: press and hold down [] until thedisplay switches to the measurement view.Hold/Max/Min✓testo 315-3 is in measurement view.The measuring values for Hold/Max/Min can be called up for:• CO and CO2• Temperature and humidity (if temperature and humidity module is connected)• Dew point (if temperature and humidity module is connected)> Call up Hold function: press [] briefly.- The current measuring values are held.> Call up Max function: press [] briefly.- The maximum measuring values are displayed.> Call up Min function: press [] briefly.- The minimum measuring values are displayed.> Set Hold/Max/Min values to the current value: press and hold down [].- Measuring values flash several times and are updated.6.3.Printing the measurement results✓testo 315-3 is on.> Call up Print function: press [] briefly.- The data is transferred to the printer via the IrDA interface.- [] lights up.7 Maintaining the product6.4.Bluetooth® and IrDA data transferData is transferred via Bluetooth® if both instruments havethis interface. Otherwise, data is transferred via the IrDAinterface.Data can currently be transferred to the following testo measuringinstruments:• testo 330 (0632 3306 / 0632 3307) from firmware version V1.11• testo 330 (0632 3304 / 0632 3305) from firmware version V1.63✓testo 315-3 is on and a measurement has been carried out.> Switch on data transfer: press and hold down [].- [] lights up. Data is transferred.Please refer to the operating instructions for the relevanttesto measuring instrument.7Maintaining the productCharging the rechargeable battery1. Connect the mains unit connector to the instrument's microUSB socket.2. Connect the mains plug to the mains socket.The charging process will start automatically. Charging is indicatedby a change in the battery icon segments. lights up when thebattery is fully charged.Cleaning the instrumentDo not use any aggressive cleaning agents or solvents! 
Mild household cleaning agents and soap suds may be used.

8 Tips and assistance

8.1. Questions and answers

Question: flashes and E145 lights up.
Possible cause/solution: The instrument temperature is outside the permissible range. > Let the instrument warm up or cool down.

Question: flashes and E161, E419, or E999 lights up.
Possible cause/solution: A serious error has occurred. > Contact your dealer or the Testo Customer Service.

Question: flashes and E290 lights up.
Possible cause/solution: The current CO2 measuring value is too high for an adjustment. > Acknowledge the error message with [Hold/Max/Min] and carry out the adjustment again in fresh air (CO2 measuring value < 650 ppm).

Question: flashes and E420 lights up.
Possible cause/solution: The fan is blocked. > Check whether anything is stuck in the ventilation slot on the back of the instrument; after 90 seconds, the fan is started again. > If nothing is directly visible, please contact your dealer or the Testo Customer Service.

Question: calibration and CO2 flash.
Possible cause/solution: The CO2 sensor must be adjusted. > Adjust the CO2 sensor in fresh air.

Question: flashes.
Possible cause/solution: Battery capacity too low. > Connect the mains unit.
Question: The display is still illuminated after the testo 315-3 is switched off.
The display illumination goes out as soon as the instrument has shut down. While the display is illuminated, the instrument cannot be switched on.

Question: The CO2 measuring value is displayed as 0000.
The current measuring value is over 10,000 ppm.

Question: The CO2 reading is implausible.
The gas opening on the back of the instrument might be covered, in which case a correct CO2 measurement is not possible.

If we could not answer your question, or the solutions given during troubleshooting did not help, please contact your dealer or Testo Customer Service. For contact data, see the back of this document or the website /service-contact.

8.2. Calibration interval
Testo recommends that calibration of the measuring instrument is carried out once a year by Testo Customer Service or a service centre authorised by Testo.

8.3. Accessories and spare parts
Description | Article no.
testo 315-3 without Bluetooth® (incl. micro USB mains unit, cable USB A - USB micro B) | 0632 3153
testo 315-3 with Bluetooth® (incl. micro USB mains unit, cable USB A - USB micro B) | 0632 3154
Temperature/humidity module | 0636 9725
Topsafe | 0516 0223
USB mains unit incl. cable | 0554 1105
Basic printer | 0554 0549
Spare printer paper (6 rolls) | 0554 0568
Control and calibration set for humidity sensors (11.3 %RH and 75.3 %RH) | 0554 0660

0970 3153 en 04
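The error codes in section 8.1 map cleanly onto a lookup table. The following is a hypothetical helper, not part of any Testo software: the codes and suggested actions come from the Q&A table above, while the dictionary and function names are ours.

```python
# Hypothetical reference helper: maps testo 315-3 error codes (section 8.1)
# to the suggested troubleshooting action. Not part of any Testo software.
ERROR_ACTIONS = {
    "E145": "Instrument temperature outside permissible range: let it warm up or cool down.",
    "E161": "Serious error: contact your dealer or Testo Customer Service.",
    "E419": "Serious error: contact your dealer or Testo Customer Service.",
    "E999": "Serious error: contact your dealer or Testo Customer Service.",
    "E290": "CO2 value too high for adjustment: acknowledge with [Hold/Max/Min] "
            "and adjust again in fresh air (CO2 < 650 ppm).",
    "E420": "Fan blocked: check the ventilation slot on the back; "
            "the fan restarts after 90 seconds.",
    "E401": "Rechargeable battery empty, instrument will soon switch off: "
            "connect the mains unit.",
}

def suggested_action(code: str) -> str:
    """Return the troubleshooting hint for an error code shown on the display."""
    return ERROR_ACTIONS.get(
        code.upper(),
        "Unknown code: contact your dealer or Testo Customer Service.",
    )
```

This keeps the manual's advice in one place; unknown codes fall back to the catch-all "contact your dealer" recommendation used throughout section 8.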

Infrared K-Type Thermometer Model CA876 User Manual


Owner's Record
The serial number for the Model CA876 is located inside the battery compartment of the instrument. Please record this number and purchase date for your records.
INFRARED K-THERMOMETER MODEL CA876
CATALOG #: 2121.34
SERIAL #: _______________________________________________
PURCHASE DATE: ________________________________________
DISTRIBUTOR: ___________________________________________

Table of Contents
1. INTRODUCTION
1.1 International Electrical Symbols
1.2 Receiving Your Shipment
1.3 Ordering Information
2. PRODUCT FEATURES
2.1 Description
2.2 Button Functions
2.2.1 Center (Yellow) Button - MEAS
2.2.2 Mode Button
2.2.3 Back-light and ▲ Button
2.2.4 Laser and Button
3. SPECIFICATIONS
3.1 Environmental Specifications
3.1.1 Infrared
3.1.2 K-Type
3.2 General Specifications
3.3 Safety Specifications
4. OPERATION
4.1 Infrared Measurement Considerations
4.2 Recommendations before Operating
4.3 Infrared Operation
4.4 K-Thermocouple Operation
4.5 Setting the Temperature Scale: °C or °F
4.6 Continuous Measurement
5. MAINTENANCE
5.1 Replacing the Battery
5.2 Cleaning
Repair and Calibration
Technical and Sales Assistance
Limited Warranty
Warranty Repairs

CHAPTER 1 INTRODUCTION

1.1 International Electrical Symbols
This symbol on an instrument indicates a WARNING and that the operator must refer to the user manual for instructions before operating the instrument. In a manual, the symbol preceding instructions indicates that if the instructions are not followed, bodily injury, installation/sample and product damage may result.
Laser Radiation - DO NOT look into the laser beam.
Laser Output < 0.5mW, 670nm wavelength.

1.2 Receiving Your Shipment
Upon receiving your shipment, make sure that the contents are consistent with the packing list. Notify your distributor of any missing items.
If the equipment appears to be damaged, file a claim immediately with the carrier and notify your distributor at once, giving a detailed description of any damage. Save the damaged packing container to substantiate your claim.

1.3 Ordering Information
Infrared Thermometer Model CA876 ................................ Cat. #2121.34
Includes K-thermocouple, 9V alkaline battery, rugged, shockproof, protective safety holster and user manual.

CHAPTER 2 PRODUCT FEATURES

2.1 Description
The AEMC® Model CA876 is a portable, easy-to-use, compact-sized digital infrared and K-thermocouple input thermometer, designed for simple one-hand operation. It uses an infrared sensor with a user-adjustable emissivity setting.
The thermometer is simply aimed at the target to be measured, without any physical contact, when in the IR mode (non-contact temperature measurement). An aiming laser enables the user to pinpoint the target. The K-type thermocouple input may also be used. The thermometer includes a programmable Alarm, which triggers on Hi or Lo, and a MIN or MAX Hold function. The thermometer is supplied with a protective holster, built-in sensor, and a K-type bead thermocouple. It also features a back-lit LCD display, an automatic data HOLD function, and Auto-OFF.
1. 3½ digit display
2. Mode selector
3. Back-light/Increase button
4. Laser/Decrease button
5. Power/Measure button

2.2 Button Functions

2.2.1 Center (Yellow) Button - MEAS
Press this button to turn the thermometer ON and perform a measurement. When the button is released, the meter will automatically HOLD (freeze) the last reading on the display for approximately 15 to 20 seconds before automatically shutting OFF.

2.2.2 Mode Button
Pressing this button makes the thermometer enter and advance through several functions, including some programmable functions. The MODE button is used to select a specific function (e.g. MIN, MAX) or to adjust an Alarm or the emissivity.
To enter the mode program, press MODE once.
SET is displayed in the lower right hand corner.
When pressed consecutively, it will scroll through the following settings:
ε → ε (Set) → ALM Hi (Set) → ALM Lo (Set) → MAX → MIN → K → ε
SET is displayed in the lower right hand corner of the display when a selected function is programmable (ε, ALM Hi, ALM Lo).
ε: Emissivity may be adjusted with the Increase and Decrease buttons to match a particular target. We recommend leaving it at 0.95 for general use.
ALM Hi: The Hi alarm set point is adjusted using the Increase and Decrease buttons. When this set point is reached, the beeper will sound and ALM HI will be displayed.
ALM Lo: The Lo alarm set point is adjusted using the Increase and Decrease buttons. When this set point is reached, the beeper will sound and ALM Lo will be displayed.
MAX: The Max value measured will be displayed during measurement. While in the MAX mode and when measuring, the MODE button will toggle the measurement between MAX – MIN – Present reading – MAX.
MIN: The Min value measured will be displayed during measurement. While in the MIN mode and when measuring, the MODE button will toggle the measurement between MIN – Present reading – MAX – MIN.
K: The lower display will indicate the reading from the K-type thermocouple.
NOTE: If the thermometer is OFF, pressing the MEAS button for more than 4 seconds will set the thermometer in the MIN/MAX record mode when powered up.

2.2.3 Back-light and ▲ Button
Press the ▲ button to turn the Back-light ON. Press it again to turn the Back-light OFF. In the settings ε, ALM Hi and ALM Lo, the ▲ button increases the numerical value displayed in the lower, smaller display area. If held down, the value will change at an increasing speed.

2.2.4 Laser and Button
The laser symbol is displayed in the upper left hand corner when the laser is ON.
Remove the laser cover before use.
The laser is activated when the MEAS button is pressed during a measurement.
In the settings ε, ALM Hi and ALM Lo, the laser button decreases the numerical value displayed in the lower, smaller display area.
If held down, the value will change at an increasing speed.
NOTE: There is no ON/OFF button. The meter turns ON when the center MEAS button is pressed, and will automatically shut OFF after approximately 15 to 20 seconds.

CHAPTER 3 SPECIFICATIONS

3.1 Environmental Specifications

3.1.1 Infrared
Temperature Scale: Celsius (°C) or Fahrenheit (°F), user-selectable
Temperature Range: -4°F to 1022°F (-20°C to 550°C)
Display Resolution: 1°F / 1°C
Accuracy:
< 212°F (100°C): ±10°F (5°C)
> 212°F (100°C): ±2% of Reading or ±6°F (3°C), whichever is greater, at 64.4 to 82.4°F (18 to 28°C) ambient operating temperature
Temperature Coefficient: Change in accuracy at operating temperatures above 82.4°F/28°C or below 64.4°F/18°C: ±0.2% of Reading or ±0.36°F/0.2°C, whichever is greater
Response Time: 1 second
Laser: Red, <0.5mW (670nm), Class II, 2 - 50 ft range
Spectral Response: 6 to 14μm nominal
Emissivity: ************, user-selectable from 0.10 to 1.00
Detection Element: Thermopile
Optical Lens: Fresnel lens
Field of View (FOV) ratio: 10:1 (Distance : Diameter)
The FOV is the ratio of the distance to the target to the target diameter. When the target diameter is small, it is important to bring the thermometer closer to the target to ensure that only the target is measured, excluding the surroundings. Remember that the measurement size is one-tenth the distance to the target. For example, if the thermometer is at 10", then the size measured is 1".

3.1.2 K-Type
Temperature Scale: Celsius (°C) or Fahrenheit (°F), user-selectable
Measurement Range: -40° to 2000°F or -40° to 1350°C
Resolution: 0.1°F/°C or 1°F/°C
Accuracy:
-328°F to 1999°F: ±0.1% of Reading ±2°F plus T.C.
-199°C to 1370°C: ±0.1% of Reading ±1°C plus T.C.
(Accuracy is specified for operating temperatures over the range of 64° to 82°F (18° to 28°C), for 1 year, excluding the sensor.)
Temperature Coefficient: 0.1 times the applicable accuracy specification per °C from 0° to 18°C and 28° to 50°C
Input Protection: 24V DC or 24Vrms maximum input voltage on any combination of inputs
Sample Rate: 2.5 times per second
Input Connector: Standard miniature thermocouple connectors (flat blades spaced 7.9mm, center-to-center)
Temperature Response: Temperature indication follows Reference Temperature/Voltage Tables, N.I.S.T. Monograph 175 Revised to ITS-90 for K-type thermocouples

3.2 General Specifications
Display: 2000-count, 3½ digit liquid crystal display (LCD) with maximum reading of 1999
Low Battery Indication: The low-battery symbol is displayed when the battery voltage drops below the required level
Sample Rate: 2.5 times per second, nominal
Operating Temperature: 32° to 122°F (0° to 50°C) at < 80% RH
Storage Temperature: -4° to 140°F (-20° to 60°C), 0 to 80% RH with battery removed
Auto Power Off: 15 seconds approx.
Altitude: 2000m max
Input Protection: 24V DC or 24Vrms maximum input voltage on any combination of inputs
Battery: Standard 9V battery (NEDA 1604, 6LR61 or equivalent)
Battery Life: 100 hours (continuous) typical with carbon zinc battery (back-light not illuminated)
Dimensions: 6.81 x 2.38 x 1.5" (173 x 60.5 x 38mm)
Weight: Approx. 9 oz (255g) including battery

3.3 Safety Specifications
EN 61010-1 (1995-A2), Protection Class III
Overvoltage Category (CAT III, 24V), Pollution Degree 2
Indoor Use
*All specifications are subject to change without notice

CHAPTER 4 OPERATION

4.1 Infrared Measurement Considerations
MEASUREMENT THEORY
Every object emits infrared (IR) energy proportional to its temperature. By measuring the amount of this radiant energy, it is possible to determine the temperature of the emitting object. Infrared radiation is invisible light (electromagnetic radiation), which easily travels through air and is easily absorbed by solid matter.
An IR thermometer, which operates by detecting infrared radiation, can accurately measure an object's surface temperature without touching it, independently of the air temperature or the measurement distance.
Infrared radiation emitted from the object is focused onto an infrared radiation sensor through an optical system. This system includes an optical lens, which is transparent to infrared radiation, and a 5.3μm cut-off filter. The output signal from the infrared radiation sensor is input to an electronic circuit, along with the output signal from a standard temperature sensor, to calculate the temperature and display it on the meter display.

EMISSIVITY
All objects emit invisible infrared energy. The amount of IR energy emitted is proportional to the object's temperature and its natural ability to emit IR energy. This ability, called emissivity, is based upon the object's material type and its surface finish. Emissivity values range from 0.10 for a very reflective object to 1.00 for a black body. The factory-set emissivity value of 0.95 will cover 90-95% of typical applications.
If frost or another material/substance covers the measured surface, clean it to expose the surface. If the surface to be measured is highly reflective, apply dull masking tape or matte black paint over the surface. If the thermometer seems to be giving incorrect readings, check the front sensor. There may be condensation or debris obstructing the sensor.
Only clean per the instructions.

Material | Emissivity
Asphalt | 0.90 to 0.98
Cloth (black) | 0.98
Concrete | 0.94
Human skin | 0.98
Cement | 0.96
Lather (soap) | 0.75 to 0.80
Sand | 0.90
Charcoal (powder) | 0.96
Earth | 0.92 to 0.96
Lacquer | 0.80 to 0.95
Water | 0.92 to 0.96
Lacquer (matte) | 0.97
Ice | 0.96 to 0.98
Rubber (black) | 0.94
Snow | 0.83
Plastic | 0.85 to 0.95
Glass | 0.90 to 0.95
Timber | 0.90
Ceramic | 0.90 to 0.94
Paper | 0.70 to 0.94
Marble | 0.94
Chromium oxide | 0.81
Plaster | 0.80 to 0.90
Copper oxide | 0.78
Mortar | 0.89 to 0.91
Iron oxide | 0.78 to 0.82
Brick (red) | 0.93 to 0.96
Textiles | 0.90

4.2 Recommendations before Operating
• If the measured surface target diameter is less than 2"/50mm Ø, place the sensor as close as possible to the target surface (<20"/50cm away). See the Field of View (FOV) information under Specifications.
• If the target surface is covered with frost or any other matter, clean it before taking a measurement.
• If the target surface is highly reflective, put some matte tape, or matte paint, over it before measuring.
• If the thermometer is erratic, or seems not to be measuring properly, make sure that the sensor is clean and not covered by condensation.

4.3 Infrared Operation
1. Press and hold the yellow measurement button - MEAS. The thermometer will display SET briefly on the main display while it auto-checks.
2. Aim the thermometer towards the target.
3. If using the aiming laser, remove the laser cover and press the laser button to turn the laser ON and OFF. The laser is activated when MEAS is pressed.
4. Infrared thermometer sensors need a certain time to stabilize to ambient temperature. Remember to let the IR meter reach ambient temperature if it is brought in from a different temperature environment.
5. Press the back-light button to turn ON the back-light.
6. If the measured temperature is outside the measurement range, an out-of-range indication will be displayed.
7. The thermometer will continue measuring as long as the MEAS button is pressed. When the button is released, the measurement will be held in the display for 15-20 seconds.
HOLD is displayed in the lower left-hand corner of the display.
8. The IR thermometer will shut OFF automatically after 15-20 seconds.

4.4 K-Thermocouple Operation
1. Connect the K-thermocouple to the instrument input.
2. Press MEAS to turn the thermometer ON.
3. Press the MODE button (six times) to enter the thermocouple mode. K will be displayed in the lower right hand corner of the display.
4. Put the thermocouple near or on the sample tested.
5. Press the MEAS button to measure. Thermocouples need a certain time to respond; take the reading when the measurement has stabilized. The reading is displayed in the smaller lower display in front of the K symbol. The main, larger display is the IR temperature reading.
6. The thermometer will continue measuring as long as the MEAS button is pressed. When the button is released, the measurement will be held in the display for 15-20 seconds.
7. When finished, remove the thermocouple from the sample, and unplug the thermocouple from the meter. The thermometer will shut OFF automatically after 15-20 seconds.
Note: IR measurements are active at the same time as the K t/c measurements.

4.5 Setting the Temperature Scale: °C or °F
The temperature scale is displayed on the upper part of the display.
To select the temperature scale:
• °C: When the thermometer is OFF, hold down the button, then press the MEAS button. °C will be displayed.
• °F: When the thermometer is OFF, hold down the button, then press the MEAS button. °F will be displayed.
The selected temperature scale will remain until changed by the user.

4.6 Continuous Measurement
The user may want to leave the thermometer ON to measure over an extended time period without having to keep pressing the MEAS button.
To enter the continuous mode:
• Turn the thermometer OFF. Hold down the MODE button, then press the MEAS button. This will set the thermometer in the continuous mode. HOLD will not be displayed in the continuous mode and the laser sighting will not run.
• When finished, press MEAS once.
HOLD will be displayed and the thermometer will shut down in 15-20 seconds. Alternatively, press MEAS again to re-enter the continuous mode.

CHAPTER 5 MAINTENANCE
Use only factory-specified replacement parts. AEMC® will not be held responsible for any accident, incident, or malfunction following a repair done other than by its service center or by an approved repair center.

5.1 Replacing the Battery
The low-battery symbol appears on the LCD display when replacement is needed. Replace with a standard 9-volt alkaline battery (NEDA 1604, 6LR61).
To replace the battery:
• Turn the meter OFF.
• Remove the rubber holster.
• Remove the screw from the back of the meter and lift off the battery cover.
• Replace the battery, then put the rear cover and holster back on.

5.2 Cleaning
• Use a soft cloth lightly dampened with soapy water.
• Rinse with a damp cloth and then dry with a dry cloth.
• Do not use any abrasives or solvents.
• Do not let any liquid enter the case or sensor area.

Repair and Calibration
To ensure that your instrument meets factory specifications, we recommend that it be scheduled back to our factory Service Center at one-year intervals for recalibration, or as required by other standards or internal procedures.
For instrument repair and calibration:
You must contact our Service Center for a Customer Service Authorization Number (CSA#). This will ensure that when your instrument arrives, it will be tracked and processed promptly. Please write the CSA# on the outside of the shipping container. If the instrument is returned for calibration, we need to know if you want a standard calibration, or a calibration traceable to N.I.S.T. (includes a calibration certificate plus recorded calibration data).
Ship To:
Chauvin Arnoux®, Inc. d.b.a. AEMC® Instruments
15 Faraday Drive
Dover, NH 03820 USA
Phone: (800) 945-2362 (Ext. 360) / (603) 749-6434 (Ext. 360)
Fax: (603) 742-2346 or (603) 749-6309
E-mail: ***************
(Or contact your authorized distributor)
Costs for repair, standard calibration, and calibration traceable to N.I.S.T.
are available.
NOTE: You must obtain a CSA# before returning any instrument.

Technical and Sales Assistance
If you are experiencing any technical problems, or require any assistance with the proper operation or application of your instrument, please call, fax or e-mail our technical support team:
Chauvin Arnoux®, Inc. d.b.a. AEMC® Instruments
Contact:
Phone: (800) 945-2362 (Ext. 351) / (603) 749-6434 (Ext. 351)
Fax: (603) 742-2346
E-mail: ********************

Limited Warranty
The Model CA876 is warranted to the owner for a period of 2 years from the date of original purchase against defects in manufacture. This limited warranty is given by AEMC® Instruments, not by the distributor from whom it was purchased. This warranty is void if the unit has been tampered with, abused or if the defect is related to service not performed by AEMC® Instruments.
For full and detailed warranty coverage, please read the Warranty Coverage Information, which is attached to the Warranty Registration Card (if enclosed) or is available at . Please keep the Warranty Coverage Information with your records.
What AEMC® Instruments will do:
If a malfunction occurs within the warranty period, you may return the instrument to us for repair, provided we have your warranty registration information on file or a proof of purchase. AEMC® Instruments will, at its option, repair or replace the faulty material.

Warranty Repairs
What you must do to return an Instrument for Warranty Repair:
First, request a Customer Service Authorization Number (CSA#) by phone or by fax from our Service Department (see address below), then return the instrument along with the signed CSA Form. Please write the CSA# on the outside of the shipping container. Return the instrument, postage or shipment pre-paid, to:
Ship To:
Chauvin Arnoux®, Inc. d.b.a. AEMC® Instruments
15 Faraday Drive • Dover, NH 03820 USA
Phone: (800) 945-2362 (Ext. 360) / (603) 749-6434 (Ext.
360)
Fax: (603) 742-2346 or (603) 749-6309
E-mail: ***************
Caution: To protect yourself against in-transit loss, we recommend that you insure your returned material.
NOTE: You must obtain a CSA# before returning any instrument.
Notes:
03/18 99-MAN 100246 v6
Chauvin Arnoux®, Inc. d.b.a. AEMC® Instruments
15 Faraday Drive • Dover, NH 03820 USA • Phone: (603) 749-6434 • Fax: (603) 742-2346
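The 10:1 field-of-view ratio in section 3.1.1 means the measured spot diameter is always one-tenth of the distance to the target. A minimal sketch of that arithmetic, assuming only the ratio stated in the manual (the function name is ours, not from the manual):

```python
# Spot-size arithmetic for the CA876's 10:1 distance-to-spot (FOV) ratio
# from section 3.1.1. Helper name is illustrative, not part of the manual.
FOV_RATIO = 10  # distance : measured-spot diameter

def spot_diameter(distance: float) -> float:
    """Diameter of the measured spot at a given distance (same units)."""
    return distance / FOV_RATIO
```

This reproduces the manual's worked example: at 10" from the target, the measured spot is 1" across, so a target smaller than that requires moving the thermometer closer.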

Performance metrics


Pijpers: Performance metrics
Frank P Pijpers has a close look at performance metrics, with the forthcoming Research Assessment Exercise in mind.

It can be argued that the quality of science cannot be measured purely by quantitative metrics, and certainly not by a single one that is supposed to fit all. Experimental groups and theoretical groups have different types of output and it is difficult to envisage a way in which the impact or quality of these can be compared either objectively or subjectively. A single instrument built by an experimental group may well lay the foundation for very many scientific results and papers by opening up a new field, and a single theoretical paper may well cause a paradigm shift. In those branches of physics where close links with industrial applications can be forged, yet another measure of success would need to be found. Even for those research groups for which the primary scientific output could be considered to be scientific papers, the rate at which papers are published – and therefore also are cited – depends on the field. Even within the area of astrophysics, in some subfields papers are published at rather larger rates than in others. No-one could reasonably claim that the former subfields have greater scientific significance or importance than the latter. In this sense it seems that the conclusions of Pearce (1994) on "good researchers" should not be used without additional judgment, as indeed pointed out by that author.
To illustrate this point I have compiled some citation statistics by making use of the Smithsonian/NASA Astrophysics Data System (ADS), which also tracks citation statistics (the UK mirror site can be found at /). The number of bibliographic entries, both refereed and non-refereed, per author in the ADS covers a wide range. Evidently there are many authors with one or a few entries, and at the other end of the scale there is an author for whom there are in excess of 11 000 entries.
A histogram showing the distribution of authors in the ADS is shown in figure 1: each bin shows the number of authors against the number of bibliographic entries per author.
An author with very many entries is likely to be better known, but it does not seem likely that the scientific impact of, for example, each of 11 000+ entries is the same. The number of citations a paper receives by itself does not necessarily reflect impact either. A paper that is authored by many can easily get a high citation ranking if each of the authors subsequently writes a further paper in which the original paper is cited. If an average paper in a given field of research has a certain a priori likelihood of being cited, i.e. the standard impact of a paper in that field, the number of citations it receives must scale with the number of papers published in that field. Also, the rate at which a typical author publishes will reflect the habits of that subfield, so that one should expect that for an unexceptional author the number of citations scales with the number of papers that this author publishes to some power that lies between 1 and 2:

N(expected citations) ∝ N(papers published)^γ, with 1 < γ < 2 (1)

From the ADS it is possible to extract the normalized citation rate of papers of any author, which is calculated by taking for each paper the number of citations it has received, divided by the number of authors, and then summing over all entries/papers on which this author appears. This to some extent accounts for differences between many-author and single-author papers.
I have randomly sampled, from the distribution shown in figure 1, authors for whom I have subsequently extracted this normalized citation count for those bibliographic entries which are refereed papers. The sampling was carried out as a constant fraction of 0.02 per bin, rather than for the distribution as a whole, in order to reduce the statistical fluctuations and cover the entire range reasonably evenly.
Also, authors with fewer than 50 bibliographic entries were omitted from the analysis in order to reduce statistical fluctuations in citation counts. The random sampling within each bin sometimes produced a name for which bibliographic entries could not uniquely be assigned. Whenever this occurred, the name was discarded and a different, randomly selected, name was substituted. The diagram resulting from this is shown on log–log scales, in which the solid line is the least-squares fit, with γ = 1.52. The behaviour appears to correspond to the expectation outlined above. In absolute terms, authors appearing in the upper right-hand corner of this diagram – such as Sir Martin Rees, for example – have great visibility and therefore impact in astrophysics. An equally useful measure of the relative importance of individual authors or groups of authors is the distance above or below the fitted function, indicating a relatively high or low number of citations given the number of papers published by that author.

Abstract: Scientific output varies between research fields and between disciplines within a field such as astrophysics. Even in fields where publication is the primary output, there is considerable variation in publication and hence in citation rates. Data from the Smithsonian/NASA Astrophysics Data System is used to illustrate this problem and argue against a "one size fits all" approach to performance metrics, especially over the short time-span covered by the Research Assessment Exercise.

Figure 1: Histogram of the number of bibliographic entries per author in the ADS against the number of authors. The dashed line shows an exponential distribution. (Axes: number of entrants against number of entries per entrant.)
An example of this is given by the strongest upward outlier to this distribution that I have found, which is Alan Guth, whose seminal work in inflation theory clearly has a very wide impact in current cosmology. His entry is not part of the random sample, and therefore also not present in figure 2, but corresponds to a 3σ upward outlier. The bottom panel shows the distribution orthogonal to the fitted function, with positive offset corresponding to authors with normalized citation rates above the fit, and negative offset to those below the fit, with a Gaussian distribution overplotted. The distribution is quite evidently skewed, with a long leftward tail, and possibly somewhat more peaked than a Gaussian distribution:

skewness = −1.09 ± 0.25 (2)
kurtosis = 1.4 ± 0.6 (3)

Creating a distribution such as this for a department or the national community requires having available a complete citation and publication record. Even if this were to exist, as argued above, it does not measure reliably the performance of those groups for which scientific publication is not the only or primary scientific output. Any attempt at measuring performance solely in this manner is, therefore, impossible in practice, and flawed in principle.
Moreover, a measure such as this covers the entire career of individuals. It is known that exceptional papers distinguish themselves primarily through being cited consistently over decades, rather than passing through a brief peak of citations and then disappearing from citation records. Even if the above were a reliable measure, it is not amenable to adaptation for the very short-term measures sought for RAE assessment exercises and the like. To the author's knowledge there is no metric that reliably measures performance over periods as short as five years, and none that is an indicator of future performance. Specifically, it seems inappropriate to use numbers of citations, even after "normalization" in the sense of the ADS, as a reliable direct metric of impact.
● Frank P Pijpers is a lecturer in the Space and Atmospheric Physics Group at Imperial College London; *********************.uk.
This research made use of NASA's Astrophysics Data System Bibliographic Services.

Reference
Pearce F 1994 A&G 45 2.15.

Figure 2: Upper panel: for each author, the number of normalized citations summed over all refereed papers is plotted against the number of refereed papers published by that author. Lower panel: the distribution around the least-squares fit shown as the dashed line in the upper panel. (Axes, upper panel: total of normalized citations against total refereed papers.)
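The least-squares fit behind equation (1) is a straight-line fit in log–log space, whose slope is the exponent γ. A minimal sketch on synthetic data, assuming a noiseless author sample generated with a known exponent (the data and helper are ours; the article's fit over real ADS authors gave γ = 1.52):

```python
import math

# Log-log least-squares slope, as used to fit N_citations ∝ N_papers^gamma
# (equation 1). On noiseless synthetic data the fitted slope recovers the
# exponent exactly (up to floating-point error).
def fit_gamma(papers, citations):
    xs = [math.log(p) for p in papers]
    ys = [math.log(c) for c in citations]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den  # slope of y on x in log-log space

# Synthetic authors: citation counts generated with the article's fitted exponent.
papers = [50, 100, 200, 400, 800]
true_gamma = 1.52
citations = [p ** true_gamma for p in papers]
gamma_hat = fit_gamma(papers, citations)
```

With real, noisy data the same slope estimate would carry an uncertainty, and an author's vertical offset from the fitted line plays the role of the "distance above or below the fitted function" discussed in the article.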

Two-way Repeated-Measures ANOVA


Research Methods II: Spring Term 2002
Using SPSS: Two-way Mixed-design ANOVA
Suppose we have performed an experiment investigating sex differences in the effects of caffeine on memory. We have two independent variables. On one of these, caffeine consumption, we have repeated measures (i.e. it is a within-subjects variable). There are three levels of caffeine consumption, low, medium and high, and each subject does all three conditions. On the other IV, sex of subject, we have independent measures (i.e. it is a between-subjects variable): there are two levels of this IV, male and female. A two-way mixed-design ANOVA (with independent measures on sex and repeated measures on caffeine consumption) is the appropriate test in these circumstances.

1. Entering the Data:
As with the previous repeated-measures ANOVA's in SPSS, we have to use a separate column for each condition of the repeated-measures variable (caffeine consumption in this example). We also have to include a column containing code-numbers that tell SPSS which group of the independent-measures variable each subject belongs to. In this example, our data might look like this:
The first column, "subject", is merely to remind you that there is one row per subject. The next three columns, "lowcaff", "medcaff" and "highcaff", contain each subject's memory-test score for the low-, medium- and high-caffeine consumption conditions respectively. The final column, "sex", tells SPSS the sex of each subject (I've used "1" to signify "male", and "2" to designate "female").

2. Running the ANOVA:
Having entered your data, do the following.
(a) Click on "Analyze"; then click on "General Linear Models"; then click on "Repeated Measures". The "Repeated Measures Define Factor(s)" dialog box appears (the same one as you used to perform one-way and two-way repeated-measures ANOVA's).
(b) For each of your repeated-measures IV's, you have to make entries in this box.
You have to tell SPSS the name of each repeated-measures IV, and how many levels it has, in just the same way as you would for previous repeated-measures ANOVA's.
In our example, we have just one repeated-measures IV: caffeine consumption. Replace the words "factor 1" with a more meaningful name that describes the variable - for example, "caffeine". Then enter the number of levels in the next box down. We have three levels of caffeine consumption, so we enter a "3" in the box. Now click on the button labelled "Add", and SPSS will put a brief summary of this IV into the box beside the button. In this case, SPSS will put "caffeine(3)" into the box.
(c) Now we have to tell SPSS which columns contain the data needed for the repeated-measures variable in our ANOVA. Click on the button labelled "Define". The "Repeated Measures Define Factor(s)" dialog box disappears, and is replaced with the "Repeated Measures" dialog box.
On the left-hand side of the dialog box is a box containing the names of the columns in your SPSS data-window. On the right-hand side is a box which contains empty slots (shown as _?_[1], for example). Move each of the column names for your repeated-measures variable, on the left, into its correct slot on the right.
Take the top slot in the right-hand box: it's got [1] next to it. This means that this is the slot for the name of the column that contains the data for the first level of your repeated-measures IV. In our example, this means the column labelled "lowcaff", which contains the data for low caffeine consumption.
The next slot, [2], is for "medcaff", as this contains the data for medium caffeine consumption. The third slot, [3], is for "highcaff" (the data for high caffeine consumption). For each highlighted slot, click on the appropriate column name; then click on the arrow-button between the boxes to enter the column name into the slot. Do this for each slot in turn.
All this is pretty much as in previous repeated-measures ANOVAs.

(d) Now we have to deal with the independent-measures IV: "sex" in this case. Click on the name of the independent-measures IV in the list in the left-hand box. Then click on the arrow button next to the box labelled "Between-Subjects Factor(s)" to move the label into that box.

(e) If your between-subjects factor (sex) had more than two levels, you could have asked SPSS to perform a Student-Newman-Keuls post hoc test by clicking on "Post hoc…". But in this case we do not need post hoc tests for sex, so we do not click here.

(f) To produce some post hoc tests for our repeated-measures factor, we click on "Contrasts…", and then the arrow to select "Repeated". Then "Change" and "Continue".

(g) Next we click on "Options…", "Descriptive Statistics", "Continue".

(h) Finally, click "OK" for SPSS to run the analysis.

The SPSS Output:

General Linear Model

[Mauchly's test is not significant for our repeated-measures variable, so the sphericity assumption is satisfied, and we do not have to worry about the Huynh-Feldt correction.]

[Next come the tests of the repeated-measures IV, both on its own and in interaction with the independent-measures IV. We use the rows labelled "Sphericity Assumed". Here, we have a highly significant effect of caffeine consumption. There is no significant interaction between sex of subject and caffeine consumption.]

[Then the within-subjects contrasts, which break down the main effect and the interaction. For the main effect of caffeine, the low and medium doses did not differ significantly, but the medium dose differed from the high dose.]

[Finally, the results for the independent-measures IV, which in this case was sex of subject. In this example, there is a highly significant main effect of sex of subject. "Intercept" refers to whether the overall mean for the memory score is above zero.]

Interpretation:

Overall, females scored higher on the memory test than males (main effect of subjects' sex: F(1, 10) = 15.33, p = 0.003).
Increased caffeine consumption improved memory scores for both males and females (main effect of caffeine consumption: F(2, 20) = 12.84, p < 0.0005). The interaction between sex and caffeine consumption failed to reach statistical significance, F(2, 20) = 3.12, p = .066.

Post hoc tests

In this type of mixed design, as in the pure between-subjects and pure repeated-measures cases, one may wish to conduct post hoc tests to explore the pattern of significant main effects or interactions.

(a) Main effects. For the main effect of sex, post hoc tests do not apply because it has only two levels. For the main effect of caffeine, we could, as before, perform all possible pair-wise comparisons if we wanted more than the tests provided by SPSS above. That is, averaging over sex, we could compare low with medium, low with high, and medium with high. To compare low with medium, run a two-way mixed ANOVA, following the same procedure as above, but entering only two levels for caffeine, namely low and medium. Enter sex as a between-subjects variable as before. In the output you are ONLY interested in the main effect of caffeine; you IGNORE all other effects. Repeat the procedure to test low against high, and then again to compare medium against high.

(b) Interaction effect. A significant interaction could be broken down in two ways: (1) the simple effect of caffeine for males and the simple effect of caffeine for females; or (2) the effect of sex for low doses of caffeine, the effect of sex for medium doses, and the effect of sex for high doses.

For the first way, go to the "Data" menu on the main menu bar at the top of the screen. Click on "Select cases". Click on "If condition is satisfied". The "If…" below it will then turn from faded to bold. Click on "If…". Enter "sex = 1" in the dialog box. Click "Continue", then "OK". Now any further analyses by SPSS will only be on the data where sex = 1, i.e. only for the males.
Now if you run a one-way repeated-measures ANOVA with caffeine as your independent variable, it will give you the simple effect of caffeine for males. Change the selection in "Select cases" to females to find the simple effect of caffeine for females.

For the second way of breaking down the interaction, perform either an unrelated t-test or a between-subjects one-way ANOVA (as per the first module) using LOWCAFF as your dependent variable and sex as your independent variable. Repeat using MEDCAFF as your dependent variable, and then finally HIGHCAFF.
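As a language-agnostic sketch (SPSS itself needs no code for any of this), the "Select cases: sex = 1" step and the cell means it feeds can be mirrored in Python on data laid out exactly like the worksheet above, one row per subject and one column per caffeine condition; the scores below are invented for illustration:

```python
# Wide-format data, one dict per subject, mirroring the SPSS columns.
data = [
    {"subject": 1, "lowcaff": 10, "medcaff": 12, "highcaff": 15, "sex": 1},
    {"subject": 2, "lowcaff": 11, "medcaff": 13, "highcaff": 14, "sex": 1},
    {"subject": 3, "lowcaff": 14, "medcaff": 16, "highcaff": 18, "sex": 2},
    {"subject": 4, "lowcaff": 13, "medcaff": 17, "highcaff": 19, "sex": 2},
]

def select_cases(rows, sex):
    """The equivalent of SPSS "Select cases: if sex = <code>"."""
    return [r for r in rows if r["sex"] == sex]

def condition_means(rows):
    """Cell means for the three caffeine levels (a simple-effect summary)."""
    cols = ("lowcaff", "medcaff", "highcaff")
    return {c: sum(r[c] for r in rows) / len(rows) for c in cols}

males = select_cases(data, sex=1)
print(condition_means(males))
# {'lowcaff': 10.5, 'medcaff': 12.5, 'highcaff': 14.5}
```

Running the one-way repeated-measures ANOVA on the selected subset (as the text describes) then tests whether those three means differ for males only.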

Understanding ISO 17511


Understanding the Importance of ISO 17511: Traceability of Clinical Laboratory Results

Feng Renfeng

Why is quality important?
• This is my granddaughter!
– Anyone who tests her must do it accurately!
– The tests on every member of your family should be accurate too!
– We can meet these requirements only by performing every test correctly!

I. What is the fundamental purpose of traceability in clinical testing?
II. What is traceability?
III. Why is traceability of patient sample results now required?
IV. Why is the "SI" placed at the highest level of metrological traceability?
V. What is a measuring system or measuring procedure?
VI. How should the metrological traceability of a calibrator be understood?

Once comparability between the reference system and the routine measuring system has been established on fresh patient samples, how can the routine system keep that traceability over time? The calibrator is introduced as the "bridge" that sustains long-term traceability. That is the function of the calibrator.

The calibrator
1. The calibrator is a component of the measuring system that performs the sample tests. In a well-performing measuring system, the assigned value of the calibrator plays an important part in the magnitude of the test results.

Elements of a measuring system: calibrator, reagent, analyzer.

Fresh patient samples must be used throughout the process of establishing a calibrator's traceability. Method comparison is always used: the calibrator's value is adjusted so that the patient sample results agree with the results obtained with the reference system.

The reference system: a reference laboratory, reference methods, reference materials and qualified personnel test fresh patient samples and obtain results at reference-value level. The level of these results is widely recognized; but because the work is manual and demanding, reporting is slow. It cannot be used routinely.

Reference laboratories
• Only results obtained on patient samples in an internationally recognized reference laboratory, using reference materials or reference methods (procedures), qualify as reference-value results.
• In any laboratory without international recognition, sample results are not reference values, even if reference materials or reference methods were used.

[Traceability chain: a primary standard standardizes the calibrator; the calibrator calibrates the routine method; the reference method and the routine method each test fresh patient samples, yielding reference-value results and routine-method results with good comparability.]

The manufacturer's place in achieving traceability
• The fundamental purpose of traceability is reliable patient results.
• A laboratory's results are traceable only if the laboratory uses a measuring system that itself has traceability.
• Therefore, manufacturers implement quality management in manufacturing their products and, on that basis, assign values to calibrators with metrological traceability in accordance with ISO 17511.

South Total Station File Code


The South total station is a high-precision surveying instrument widely used in civil engineering, construction, and surveying and mapping. It measures the three-dimensional coordinates of ground points, offering high accuracy and simple operation. When using a South total station for surveying, some file code needs to be written to automate measurement tasks. Some commonly used South total station file code is introduced below.

1. Connecting to the instrument

Before measuring with a South total station, a connection with the computer must first be established. The connection code is as follows:

```python
import serial  # pyserial

def connect_instrument(port, baudrate):
    # Open the serial link to the total station and return the handle.
    instrument = serial.Serial(port, baudrate)
    return instrument
```

Here the `port` parameter is the serial port name and `baudrate` is the baud rate. Calling `serial.Serial()` opens a connection to the total station and returns an instrument object.

2. Setting measurement parameters

Before measuring, parameters such as the measurement mode and the measurement accuracy must be set:

```python
def set_measurement_parameters(instrument, mode, accuracy):
    # Send the parameter-setting commands over the serial link.
    instrument.write(f"SETMODE {mode}\n".encode())
    instrument.write(f"SETACCURACY {accuracy}\n".encode())
```

The `mode` parameter is the measurement mode (for example angle measurement or distance measurement); the `accuracy` parameter is the measurement precision (for example high or normal). Calling `instrument.write()` sends the parameter-setting commands to the total station.

3. Performing a measurement

Once the parameters are set, the measurement can be run:

```python
def perform_measurement(instrument):
    # Trigger a measurement and read back one result line.
    instrument.write("STARTMEAS\n".encode())
    result = instrument.readline().decode().strip()
    return result
```

Calling `instrument.write()` sends the start-measurement command to the total station, and `instrument.readline()` reads back the result line.
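Once `perform_measurement()` returns a reply string, it usually has to be parsed into numeric fields. The reply format below (`HA`/`VA`/`SD` key:value pairs for horizontal angle, vertical angle and slope distance) is a hypothetical illustration, not the actual South instrument protocol:

```python
def parse_measurement(result):
    """Parse a 'KEY:value KEY:value' reply string into a dict of floats.

    The field names (HA, VA, SD) are illustrative only; consult the
    instrument manual for the real reply format.
    """
    fields = {}
    for token in result.split():
        key, _, value = token.partition(":")
        fields[key] = float(value)
    return fields

print(parse_measurement("HA:123.4567 VA:89.1234 SD:102.345"))
# {'HA': 123.4567, 'VA': 89.1234, 'SD': 102.345}
```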

Signaling Flows


No.7 (SS7) Signaling Flows

I. TUP signaling flow

Exchange A / Exchange B:
IAI initial address message (with additional information)
ACM address complete message
ANC answer, charge message
SPEECH PHASE
CBK clear-back (called party hangs up)
CLF clear-forward
RLG release guard

The initial address message may be IAM (with SAM, SAO for subsequent address digits). GRQ requests the calling party's number; GSM returns the calling party's number.

II. ISUP signaling flow

Exchange A / Exchange B:
IAM initial address message
ACM address complete message
ANM answer message
SPEECH PHASE
REL release message
RLC release complete message

III. MAP signaling flows

1. Location update
1.11 Intra-VLR location update
1.12 Inter-VLR location update: IMSI attach; TMSI reallocation; if retrieving subscriber data by TMSI fails, the network requests the IMSI from the user.

2. Mobile-originated call
1. In the serving cell, the mobile requests a channel on the radio interface by random access, to be used as a signaling channel.
2. A signaling connection is established between the mobile and the MSC (an SCCP connection between the MSC and the BSC), and the mobile sends a service request carrying its identity (IMSI or TMSI), the requested service type, and so on.
3. The user is authenticated. If the user's identity must be requested, this is done before authentication.
4-5. If the system is configured for ciphering, the ciphering mode is set here, and a new TMSI is allocated.
6. The mobile sends a CALL_SET_UP request carrying the called number, the call service class, etc., which starts the actual call-establishment procedure.
7. The MSC assigns a traffic channel. This is a radio resource management command: the MSC in fact only assigns one PCM circuit toward the BSC, and the BSS then assigns a corresponding radio channel toward the MS.
8. The MSC sets up the speech path toward the PSTN.
9. If the called party is idle, the PSTN completes the path to the called party and, while ringing the called party, returns ACM to the MSC; the MSC sends ringback tone to the calling mobile.
10. The called party answers; the PSTN sends an answer signal to the MSC, and the MSC sends CONNECT to the MS. The speech path is now through and the call is in conversation.
11-14 illustrate call release, assuming the called party hangs up first:
11. The PSTN notifies the MSC that the called party has hung up (CLB signal); the MSC releases the path to the PSTN in the forward direction.
12. The MS is notified that the called party has hung up (Disconnect signal), and the call is released.
13. The PCM circuit and the radio traffic-channel and signaling-channel resources are released.
14. The SCCP connection is released.

3. Mobile-terminated call; ciphering

IV. Service 11.1, see Figure 1-1: 1) on receiving the call, based on the O-CSI information, … triggers the IDP message …
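As an illustrative sketch only (not tied to any real SS7 stack), the ISUP exchange above can be modeled as a small state machine; the message names follow the flow described here, while the state names are invented:

```python
# Minimal ISUP call-model sketch: which message is legal in which state.
# Follows the IAM -> ACM -> ANM -> (speech) -> REL -> RLC flow above.
TRANSITIONS = {
    ("idle", "IAM"): "setup",        # initial address message starts the call
    ("setup", "ACM"): "alerting",    # address complete: called party is rung
    ("alerting", "ANM"): "active",   # answer message: speech phase begins
    ("active", "REL"): "releasing",  # release request
    ("releasing", "RLC"): "idle",    # release complete: circuit free again
}

def run_call(messages):
    """Feed a message sequence through the model; reject illegal sequences."""
    state = "idle"
    for msg in messages:
        key = (state, msg)
        if key not in TRANSITIONS:
            raise ValueError(f"unexpected {msg} in state {state}")
        state = TRANSITIONS[key]
    return state

print(run_call(["IAM", "ACM", "ANM", "REL", "RLC"]))  # a complete call -> idle
```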


Performance Evaluation Techniques: Experiment/Measurement, Analysis, Simulation/Emulation
• Performance evaluation techniques are the methods used to evaluate the performance of a given computer system and to obtain the metrics we are interested in.
• Experiment/measurement technique: measure the system's performance metrics, or quantities related to them, directly with measurement devices or measurement programs (software), then compute the corresponding metrics from them.
• Modeling technique: build an appropriate model of the system under evaluation, then derive the model's performance metrics in order to evaluate the system. This divides further into analytical and simulation techniques.
• The analytical technique uses mathematical analysis: the system is simplified and an analytical model is built, from which the system's performance is obtained.
• The simulation technique uses software simulation: a simulation model is built that describes the computer system in detail and faithfully. When the model runs the way the system itself does, statistics on its dynamic behavior yield the relevant performance metrics.
• Measurement, analysis and simulation are the main performance evaluation techniques in use today; they complement and cross-validate one another, and each has strengths and weaknesses.
• Emulation = simulation + experiment.
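The analytical-versus-simulation contrast above can be made concrete with an M/M/1 queue, whose mean number in system has the standard closed form rho / (1 - rho); the arrival and service rates below are illustration values only:

```python
import random

def mm1_analytic(lam, mu):
    """Analytical technique: mean number in an M/M/1 system, rho/(1 - rho)."""
    rho = lam / mu
    return rho / (1 - rho)

def mm1_simulate(lam, mu, n=100000, seed=1):
    """Simulation technique: estimate the same quantity by simulating customers.

    Uses Lindley's recursion for FCFS waiting time, then Little's law
    N = lambda * (mean sojourn time).
    """
    rng = random.Random(seed)
    wait = 0.0
    total_sojourn = 0.0
    for _ in range(n):
        service = rng.expovariate(mu)
        total_sojourn += wait + service          # this customer's time in system
        interarrival = rng.expovariate(lam)
        wait = max(0.0, wait + service - interarrival)
    return lam * total_sojourn / n

lam, mu = 3.0, 5.0                # made-up arrival and service rates
print(mm1_analytic(lam, mu))      # 1.5 exactly
print(mm1_simulate(lam, mu))      # close to 1.5
```

The two techniques answer the same question: the analytical model is instant but required simplifying the system to M/M/1; the simulation is slower but can be extended to details no closed form captures.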
Passive Measurement Tools
• Using SNMP tools to measure the egress traffic of TJU over 24 hours
[Figure: traffic in bytes per time unit versus time (unit = 1); plot data not recoverable]
Workload Generation
Active Measurement: Traceroute
• Traceroute: path and RTT
– TTL (Time-To-Live) field in IP packet header
• Source sends a packet with a TTL of n
• Each router along the path decrements the TTL
• "TTL exceeded" sent when TTL reaches 0
– Traceroute tool exploits this TTL behavior
• Send packets with increasing TTL values
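The TTL mechanism these bullets describe can be sketched as a pure simulation; no real packets are sent, and the router names are made up:

```python
# Simulated path from source to destination; each entry is one hop.
PATH = ["r1", "r2", "r3", "dest"]

def send_probe(ttl):
    """Return what comes back for a probe with the given initial TTL."""
    for hop in PATH:
        ttl -= 1                          # each router decrements the TTL
        if ttl == 0 and hop != "dest":
            return ("time_exceeded", hop)  # router reports "TTL exceeded"
        if hop == "dest":
            return ("echo_reply", hop)     # probe reached the target

def traceroute():
    """Increase TTL 1, 2, 3, ... until the destination itself answers."""
    hops, ttl = [], 1
    while True:
        kind, hop = send_probe(ttl)
        hops.append(hop)
        if kind == "echo_reply":
            return hops
        ttl += 1

print(traceroute())  # ['r1', 'r2', 'r3', 'dest']
```

A real traceroute works the same way, except the "reply" is an ICMP time-exceeded message carrying the router's address, and each probe's RTT is recorded as well.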
Active Measurement: Ping
• Adding traffic for purposes of measurement
– Send probe packet(s) into the network and measure a response
– Trade-offs between accuracy and overhead
– Need careful methods to avoid introducing bias
• Ping: RTT and connectivity
– Host sends an ICMP ECHO packet to a target
– … and captures the ICMP ECHO REPLY
– Useful for checking connectivity, and RTT
– Only requires control of one of the two end-points
• Problems with ping
– Round-trip rather than one-way delays
– Some hosts might not respond
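The RTT summary line that ping prints can be reproduced from a list of samples; the values below are invented, and mdev is computed here as the population standard deviation (as in the iputils implementation of ping):

```python
def rtt_stats(samples):
    """min/avg/max/mdev over a list of RTT samples in milliseconds."""
    n = len(samples)
    avg = sum(samples) / n
    # mdev: population standard deviation of the samples
    mdev = (sum((x - avg) ** 2 for x in samples) / n) ** 0.5
    return min(samples), avg, max(samples), mdev

rtts = [12.1, 11.8, 13.0, 12.5]   # invented probe RTTs
print("rtt min/avg/max/mdev = %.3f/%.3f/%.3f/%.3f ms" % rtt_stats(rtts))
```

A large mdev relative to the minimum is a quick indicator of queueing delay variation along the round-trip path.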
Computer System/Network Performance Evaluation, 2008
Computer System/Network Performance Evaluation
• Applications of performance evaluation:
– research, design, operation, improvement
– of systems, hardware, software, protocols, algorithms: computers, networks and beyond
• Goals of improving a system:
– raise the performance of the existing system without adding to its configuration
– increase the load the existing system can carry without adding to its configuration
– improve the configuration to raise system performance
• The process of improving computer system performance:
– define the performance metrics
– build system models (measurement, simulation, analytical) and workload models
– evaluate the performance (measurement, simulation, analysis)
– repeat the steps above
Measurement Overview
• Measurement is necessary for understanding current system behavior and how new systems will behave – How, when, where, what do we measure? • Size, complexity and diversity of the Internet makes it very difficult to understand cause-effect relationships • Measurement is meaningless without careful analysis – Analysis of data gathered from networks is quite different from work done in other disciplines • Measurement enables models to be built which can be used to effectively develop and evaluate new techniques – Statistical models – Queuing models – Simulation models
Performance Metrics
• System processing capacity
– Throughput: the number of transactions, jobs, etc. that the system processes per unit of time.
• System responsiveness
– Response time: mainly for interactive use, the system's responsiveness to a terminal user; it can be defined as the interval from the moment the user types the last character at the terminal keyboard to the moment the terminal outputs the first character.
– Turnaround time: mainly for batch processing; the time from when a job is submitted to the system through an input device until its results appear on an output device.
– Queueing time: the sum of the time a job waits in the queue and the time the system spends serving it.
• Utilization
– Over an observation interval T, the ratio t/T, where t is the time during which a given part of the system (hardware or software) is in use, is the utilization of that part.
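As a sketch of how the metrics above fall out of a job log, assuming a single server and made-up (arrival, start, finish) times:

```python
# Each job record: (arrival, service start, finish) times in seconds.
jobs = [(0.0, 0.0, 2.0), (1.0, 2.0, 3.5), (4.0, 4.0, 6.0)]

T = 6.0                                   # observation interval
throughput = len(jobs) / T                # jobs completed per second
busy = sum(f - s for _, s, f in jobs)     # time the server was in use
utilization = busy / T                    # t / T from the definition above
turnaround = [f - a for a, _, f in jobs]  # submission -> results, per job
queueing = [s - a for a, s, _ in jobs]    # wait before service starts

print(throughput, utilization)            # 0.5  0.916666...
print(turnaround, queueing)               # [2.0, 2.5, 2.0] [0.0, 1.0, 0.0]
```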
Three delay components:
• d: propagation delay
• L/c: transmission delay (in a plot of minimum RTT against packet size L, the slope is 1/c)
• queueing delay (appears as noise)
How to infer d and c?
Passive Measurement Tools
• Passive tools: capture data as it passes by
– Logging at application level
– Packet capture software (tcpdump) uses a packet capture filter (bpf, libpcap)
• Requires access to the wire
• Can have many problems (adds, deletes, reordering)
– Flow-based measurement tools
– SNMP tools
– Routing looking-glass sites
• Problems
– LOTS of data!
– Privacy issues
– Getting packets captured in the backbone of the network
[Diagram: the source sends probes with TTL=1, TTL=2, …; each router along the path returns a "Time exceeded" message, until the destination is reached.]

Send packets with TTL=1, 2, 3, … and record the source of each "time exceeded" message.
Active Measurement: Pathchar for Links
Network Performance Characteristics - Performance Metrics
• Throughput
• Latency / delay
• Response time
• Loss
• Utilization
• Arrival rate, departure rate
• Bandwidth, capacity
• Routing (hops)
• Reliability
---- per-hop capacity, latency, loss

rtt(i+1) - rtt(i) = d + L/c

where i is the initial TTL value, L the packet size, c the link capacity, and d the propagation delay of the added hop.

[Plot: the minimum of rtt(i+1) - rtt(i) over many probes, against packet size L; the slope is 1/c and the intercept is d.]
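The inference sketched above (fit the minimum RTT difference against packet size L; the slope gives 1/c, the intercept gives d) can be illustrated with a least-squares fit on noise-free synthetic data generated from known d and c:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

d = 0.002                      # propagation delay: 2 ms (made-up)
c = 1.25e6                     # link capacity: 1.25 MB/s = 10 Mbit/s (made-up)
sizes = [64, 256, 512, 1024, 1500]      # probe packet sizes in bytes
rtt_diff = [d + L / c for L in sizes]   # per-hop minima, no queueing noise

intercept, slope = fit_line(sizes, rtt_diff)
print(round(intercept, 6), round(1 / slope))  # recovers d and c: 0.002 1250000
```

With real probes, the minimum over many repetitions at each size is used first, precisely to strip away the queueing-delay "noise" before the fit.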
Active Measurement Tools
• Send probe packet(s) into the network and measure a response
– Ping: RTT and loss
• Zing: one-way Poisson probes
– Traceroute: path and RTT
– Nettimer: estimates bottleneck bandwidth using the packet-pair method
Passive Measurement Tools
• Exploiting Ethernet's broadcast nature to measure the traffic of the host's network segment
• Measurement with tcpdump
[Example, truncated in the source: measurements from April 1999 of the number of WWW requests and the requested URLs (… a total of 65…; … a maximum value of 413, …); the accompanying plot of WWW request number is not recoverable.]
Packet pair: two packets of size S sent back-to-back through a bottleneck link of bandwidth BW arrive with spacing

T(n+1) - T(n) = max(S/BW, T1 - T0)

[Diagram: input spacing T1 - T0; output spacing S/BW after the bottleneck.]
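A sketch of the packet-pair relation: when the packets are sent back-to-back, the output spacing is S/BW, so the receiver can estimate the bottleneck bandwidth from that spacing. The sizes and rates below are made-up illustration values:

```python
def output_spacing(S, BW, input_spacing):
    """Spacing of a packet pair after the bottleneck: max(S/BW, input spacing)."""
    return max(S / BW, input_spacing)

S = 1500 * 8                 # packet size in bits (a 1500-byte packet)
BW = 10e6                    # bottleneck bandwidth: 10 Mbit/s (made-up)
spacing = output_spacing(S, BW, input_spacing=0.0)  # sent back-to-back

bw_estimate = S / spacing    # receiver-side estimate from the observed spacing
print(round(bw_estimate))    # 10000000
```

In practice many pairs are sent and a robust statistic (e.g. the mode of the estimates) is used, since cross-traffic can compress or stretch the spacing.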