
Medcom Radalert100 Multifunctional Geiger-Müller Radiation Detector

Brand: Medcom
Origin: USA
Model: Radalert100
Detection Principle: Halogen-quenched Geiger-Müller tube
Energy Range: 20 keV – 3 MeV
Dose Rate Range (mR/hr): 0.001 – 110.0
Dose Rate Range (µSv/hr): 0.01 – 1100
Count Rate (CPM): 0 – 350,000
Count Rate (CPS): 0 – 3,500
Total Count Capacity: 1 – 9,999,000
Calibration Source: ¹³⁷Cs (γ)
Sensitivity: 1,000 CPM per mR/hr (referenced to ¹³⁷Cs)
Accuracy: ±10% typical, ±15% maximum
Display: 4-digit LCD with mode indicators
Audio/Visual Alert: User-adjustable threshold, LED flash per count
Interface: 3.5 mm output jack (0–9 V, 1 kΩ impedance), 2.5 mm calibration input
Power: 9 V alkaline battery (≥700 h typical life)
Dimensions: 150 × 80 × 30 mm
Weight: 323 g (including battery)
Compliance: CE-marked

Overview

The Medcom Radalert100 Multifunctional Geiger-Müller Radiation Detector is a field-deployable, general-purpose radiation survey instrument engineered for reliable detection and quantification of alpha, beta, gamma, and X-ray radiation. Based on a halogen-quenched Geiger-Müller (GM) tube with a 45 mm effective diameter, the device operates across an energy range of 20 keV to 3 MeV—making it suitable for environmental monitoring, occupational safety assessments, educational demonstrations, and emergency response scenarios. Its core measurement principle relies on ionization events within the GM tube gas fill, converted into discrete electrical pulses that are counted and translated into dose rate (mR/hr or µSv/hr) and count rate (CPM/CPS) values. Unlike scintillation- or semiconductor-based systems, the Radalert100 prioritizes robustness, simplicity, and real-time responsiveness over spectral resolution—ideal for rapid screening, boundary surveillance, and baseline radiation trend analysis.
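Given the stated sensitivity of 1,000 CPM per mR/hr (¹³⁷Cs reference), count-rate readings convert to dose-rate units with simple arithmetic. The sketch below assumes that factory sensitivity and the common 1 mR/hr ≈ 10 µSv/hr exposure-to-dose approximation; the function names are illustrative, not part of any Medcom software.

```python
# Convert Radalert100 count rates to dose-rate units, assuming the
# stated 137Cs sensitivity of 1,000 CPM per mR/hr (gamma response).

CPM_PER_MR_HR = 1000.0   # factory sensitivity, 137Cs at 662 keV
USV_PER_MR = 10.0        # 1 mR/hr ~ 10 uSv/hr (common approximation)

def cpm_to_mr_hr(cpm: float) -> float:
    """Dose rate in mR/hr from a count rate in counts per minute."""
    return cpm / CPM_PER_MR_HR

def cpm_to_usv_hr(cpm: float) -> float:
    """Dose rate in uSv/hr from a count rate in counts per minute."""
    return cpm_to_mr_hr(cpm) * USV_PER_MR
```

For example, a reading of 12 CPM corresponds to roughly 0.012 mR/hr, or about 0.12 µSv/hr, in line with the typical background conditions cited in the battery-life specification.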

Key Features

  • Real-time dual-unit display: Simultaneous readout in traditional (mR/hr, CPM) and SI units (µSv/hr, CPS) via intuitive 4-digit LCD with dedicated mode indicators
  • Three-second update interval: Optimized for dynamic field conditions while maintaining statistical stability in low-dose environments
  • User-configurable utility menu: Enables adjustment of alarm thresholds, unit preferences, backlight timeout, and auto-zero behavior without external software
  • Dual-interface connectivity: 3.5 mm analog output (0–9 V, 1 kΩ) supports direct connection to data loggers, oscilloscopes, or PC-based acquisition systems; 2.5 mm calibration input allows traceable verification using external reference sources
  • High-sensitivity GM tube: Calibrated against ¹³⁷Cs (662 keV γ), delivering 1,000 CPM per mR/hr—ensuring consistent response across common isotopic emissions encountered in industrial, medical, and natural settings
  • Low-power architecture: Powered by a single 9 V alkaline battery with ≥700 hours of continuous operation under typical background conditions (≈0.1 µSv/hr), minimizing logistical overhead during extended deployments

Sample Compatibility & Compliance

The Radalert100 is designed for direct exposure measurements of environmental surfaces, air, and unshielded radioactive materials. Its thin mica end-window enables efficient detection of low-penetrating alpha and beta particles—provided source proximity and geometry comply with ANSI N42.33 and IEC 60846-1 requirements for portable radiation monitors. While not intended for spectroscopic identification, the instrument meets CE marking criteria for electromagnetic compatibility (EN 61326-1) and safety (EN 61010-1). It supports routine compliance verification per OSHA 1910.120, NRC Regulatory Guide 8.29, and EPA Method M-111 for area monitoring. For GLP/GMP-aligned workflows, users may pair the analog output with validated third-party data acquisition software to satisfy audit-trail and electronic record retention requirements under 21 CFR Part 11.

Software & Data Management

The Radalert100 operates as a standalone instrument but integrates seamlessly with external data systems via its analog voltage output. When connected to a calibrated analog-to-digital converter or compatible USB interface (e.g., LabJack U3 or National Instruments DAQ devices), raw pulse counts can be logged at user-defined intervals—enabling time-series analysis, dose accumulation tracking, and geotagged survey mapping. Medcom provides optional Windows-compatible configuration software and shielded interface cables for standardized setup. All stored data retains full metrological traceability to the factory calibration performed using ¹³⁷Cs, with documented uncertainty budgets available upon request. No proprietary firmware updates or cloud services are required—ensuring long-term operational independence and IT security compliance in regulated laboratory or nuclear facility environments.
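The interval-logging workflow described above can be sketched as follows. Here `read_pulse_count` is a hypothetical placeholder for whatever cumulative-counter call your DAQ or ADC interface actually provides; it is not a Medcom, LabJack, or National Instruments API.

```python
# Sketch of interval logging for a pulse-counting DAQ front end.
# `read_pulse_count` is a hypothetical callable returning a
# monotonically increasing cumulative pulse count.
import csv
import time
from datetime import datetime, timezone

def log_counts(read_pulse_count, path="survey_log.csv",
               interval_s=60.0, samples=60):
    """Accumulate pulses over fixed intervals and append
    timestamped counts and CPM values to a CSV file."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["utc_time", "counts", "cpm"])
        for _ in range(samples):
            start = read_pulse_count()
            time.sleep(interval_s)
            counts = read_pulse_count() - start
            cpm = counts * (60.0 / interval_s)
            writer.writerow([datetime.now(timezone.utc).isoformat(),
                             counts, f"{cpm:.1f}"])
            f.flush()  # keep the log current during long surveys
```

Appending rather than overwriting, and flushing after each row, keeps the log usable for dose-accumulation tracking even if a long deployment is interrupted.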

Applications

  • Personal dosimetry support for radiological workers during decommissioning, NDT operations, or hospital radiopharmacy handling
  • Perimeter and access-control monitoring around nuclear medicine departments, research reactors, or legacy uranium processing sites
  • Rapid leak detection and contamination mapping following transport incidents or storage failures involving sealed sources
  • Long-term background radiation trending in geological surveys, mining exploration, or post-remediation verification
  • Classroom demonstration of ionizing radiation interactions, inverse-square law validation, and shielding efficacy testing
  • Verification of consumer product compliance (e.g., granite countertops, ceramic glazes, antique radium dials) against IAEA Safety Standards RS-G-1.7 limits
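For the inverse-square classroom demonstration listed above, expected readings at a new distance can be predicted from a single reference measurement. A minimal sketch, assuming a point source in air with negligible scatter and attenuation (the function name is illustrative):

```python
# Inverse-square prediction: net count rate from a small source
# scales as 1/d^2 with distance d from the detector.
def predicted_cpm(cpm_ref, d_ref_cm, d_cm, background_cpm=0.0):
    """Predict gross CPM at distance d_cm from a reference reading
    taken at d_ref_cm, treating the source as a point emitter."""
    net_ref = cpm_ref - background_cpm      # subtract background first
    return background_cpm + net_ref * (d_ref_cm / d_cm) ** 2
```

Doubling the distance should quarter the net count rate: a 4,000 CPM net reading at 10 cm predicts about 1,000 CPM at 20 cm, an easy classroom check of the model.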

FAQ

What radiation types does the Radalert100 detect?
It detects alpha, beta, gamma, and X-ray radiation—subject to geometric and energetic constraints imposed by the GM tube’s mica window thickness and fill gas composition.

Is the Radalert100 suitable for measuring radon gas?
No. It does not provide integrated radon progeny discrimination or alpha-spectroscopic capability; dedicated radon monitors (e.g., electret ion chambers or pulse-ionization detectors) are required for accurate radon-222 assessment.

Can the device be calibrated in-house?
Yes—via the 2.5 mm calibration input port using a traceable reference source and appropriate attenuators, though periodic third-party verification against national standards (e.g., NIST-traceable ¹³⁷Cs) is recommended every 12 months.
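One way to structure such an in-house check, assuming a traceable source whose dose rate at the detector position is known, is to compare the observed count rate against the factory sensitivity. The helper names below are illustrative, not part of any calibration standard:

```python
# In-house calibration check against the stated 137Cs sensitivity
# of 1,000 CPM per mR/hr; tolerance matches the +/-10% typical
# accuracy figure from the specifications.
def calibration_error(measured_cpm, known_mr_hr,
                      sensitivity_cpm_per_mr_hr=1000.0):
    """Fractional deviation from the expected response
    (positive means the instrument reads high)."""
    expected_cpm = known_mr_hr * sensitivity_cpm_per_mr_hr
    return (measured_cpm - expected_cpm) / expected_cpm

def within_spec(measured_cpm, known_mr_hr, tolerance=0.10):
    """True if the reading sits inside the typical accuracy band."""
    return abs(calibration_error(measured_cpm, known_mr_hr)) <= tolerance
```

A reading of 1,050 CPM against a source producing 1.0 mR/hr would pass, while 1,200 CPM would flag the unit for third-party recalibration.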

Does it meet FDA or ISO requirements for medical use?
It is not classified as a medical device under FDA 21 CFR Part 820 or ISO 13485; it is intended for environmental and industrial monitoring—not diagnostic or therapeutic dose verification.

How is accuracy affected at low dose rates?
Relative statistical uncertainty scales as the inverse square root of the accumulated counts, so it grows as the count rate falls; below 0.01 µSv/hr, measurement duration should exceed 60 seconds to maintain ±15% confidence per ANSI N42.22 Annex B protocols.
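That guidance follows from Poisson counting statistics: the 1-sigma relative uncertainty of N accumulated counts is 1/√N. A brief sketch of the arithmetic (illustrative helper functions, not taken from the standard):

```python
# Poisson counting statistics: the 1-sigma relative uncertainty
# of N accumulated counts is 1/sqrt(N).
import math

def relative_uncertainty(counts):
    """1-sigma relative uncertainty for a Poisson count."""
    return 1.0 / math.sqrt(counts)

def required_seconds(cpm, target_rel_unc):
    """Counting time needed so 1/sqrt(N) <= target_rel_unc
    at a given count rate in CPM."""
    needed_counts = 1.0 / target_rel_unc ** 2
    return needed_counts * 60.0 / cpm
```

For example, 100 accumulated counts give about 10% relative uncertainty, while 1,000 counts tighten this to roughly 3.2%; the lower the dose rate, the longer the counting time needed to accumulate those counts.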
