The ipcs configuration file:
----------------------------

For information about the config file, see:

    IPCS_DISK:[IPCS.NETMAILER.EXE]nm_config.txt

The ipcs configuration file is named:

    on VMS:   IPCS_DISK:[IPCS.NETMAILER.EXE]netmailer.cfg
    on unix:  nm_config.dat  (probably in /opt/ipcs2/etc/)

On VMS, the logical name IPCS$NETCFG *should* be an option to override the
default config file, but it does not work on Birch (Multinet).  I think that
it *does* work on the other VMS nodes (UCX), but I wouldn't bet on it.


Communication between ipcs cells:
---------------------------------

A cell is defined in the optional "this_host" line of the config file:

    /cell_netmailer=nstx_pcs     (in the config file for one host only)

A cell acts as a single entity, through the cell_netmailer, when talking to
other cells.  Only the cell_netmailer node knows about the other cells to
which it might communicate:

    ga_d3d   <-->  pppl_res
    pppl_res <-->  nstx_pcs

If EVTMGR_NSTX runs on a node in the pppl_res cell, then it can be reached by
nodes in both the ga_d3d and nstx_pcs cells.  However, if EVTMGR_NSTX runs in
the nstx_pcs cell, then it can be reached only from pppl_res.  (A small
sketch of this visibility rule is appended at the end of these notes.)


Example config files:
---------------------

Unix: nm_config.dat

  epicsrv1 (cell_netmailer for "nstx_pcs"):

    this_host=epicsrv1 /cell_netmailer=nstx_pcs /protected
    nstx_pcs /CONNECTION=WOLLONGONG_TCP /INTERNET=198.35.10.21  /PORT=22375
    ppcc     /CONNECTION=WOLLONGONG_TCP /INTERNET=198.35.10.102 /PORT=22375
    pppl_res /connection=tcp /internet=192.55.106.24 /port=1000

  ppcc (member of the nstx_pcs cell):

    this_host=* /protected
    nstx_pcs /CONNECTION=TCP /INTERNET=198.35.10.21  /PORT=22375
    ppcc     /CONNECTION=TCP /INTERNET=198.35.10.102 /PORT=22375

VMS: netmailer.cfg

  birch (cell_netmailer for "pppl_res"):

    this_host=* /cell_netmailer=pppl_res /protected
    RAX      /connection=tcp /internet=192.55.106.12 /port=1000
    BEANIE   /connection=tcp /internet=192.55.106.22 /port=1000
    BIRCH    /connection=tcp /internet=192.55.106.24 /port=1000
    ga_d3d   /CONNECTION=TCP /INTERNET=192.73.62.46  /PORT=22375
    ga_usc   /CONNECTION=TCP /INTERNET=192.5.166.24  /PORT=22375
    llnl_res /CONNECTION=TCP /INTERNET=128.115.15.25 /PORT=22375
    nstx_pcs /CONNECTION=TCP /INTERNET=198.35.10.21  /PORT=22375

  beanie, rax (members of the "pppl_res" cell):

    this_host=* /protected
    RAX      /connection=tcp /internet=192.55.106.12 /port=1000
    BEANIE   /connection=tcp /internet=192.55.106.22 /port=1000
    pppl_res /connection=tcp /internet=192.55.106.24 /port=1000

(A quick TCP connect test against these cell_netmailer addresses is sketched
at the end of these notes.)


An explanatory (?) e-mail:
--------------------------

From: BEANIE::GIBNEY
To:   PSICHTA@PPPL.GOV,RONEY,BDAVIS
CC:   GIBNEY
Subj: ipcs

Paul, Phyllis and Bill,

Oh happy day -- I think that I now have ipcs communications sorted out.
At this point, you should be able to communicate between the "nstx_pcs"
cell (epicsrv1, ppcc) and the "pppl_res" cell (birch, beanie, rax).
The "pppl_res" cell also knows about cells at GA and LLNL:

    nstx_pcs <==> pppl_res
    pppl_res <==> ga_d3d
    pppl_res <==> llnl_res

ipcs (EVT) messages from pppl_res can be seen on both sides.

EVTMGR_NSTX should run within the nstx_pcs cell.  At the moment, it is
running on epicsrv1.  This way, events can be shared between the nstx_pcs
and pppl_res cells, but cannot be seen offsite at GA or LLNL.  If
EVTMGR_NSTX were run on, say, Birch, then events would also be visible at
these remote sites.

Tom
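

A sketch of the cell visibility rule:
-------------------------------------

This is only an illustration of the rule described above, not anything that
ships with ipcs: a process is visible within its own cell and to the cells
that its cell_netmailer talks to directly.  The cell names and links come
from the e-mail and the config examples; everything else here (the
dictionary, the function name) is made up for the sketch.

    # Illustrative only -- not part of ipcs.  The links are the
    # cell-to-cell connections described in the notes above.
    CELL_LINKS = {
        "pppl_res": {"nstx_pcs", "ga_d3d", "llnl_res"},
        "nstx_pcs": {"pppl_res"},
        "ga_d3d":   {"pppl_res"},
        "llnl_res": {"pppl_res"},
    }

    def visible_from(home_cell):
        """Cells that can reach a process running in home_cell: its own
        cell plus the cells its cell_netmailer connects to directly."""
        return {home_cell} | CELL_LINKS.get(home_cell, set())

    # EVTMGR_NSTX on epicsrv1 (nstx_pcs cell): seen by nstx_pcs and pppl_res only.
    print(sorted(visible_from("nstx_pcs")))
    # EVTMGR_NSTX on, say, Birch (pppl_res cell): also visible at GA and LLNL.
    print(sorted(visible_from("pppl_res")))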
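

A quick connectivity check (not an ipcs tool):
----------------------------------------------

The snippet below is just a generic TCP connect test against the
cell_netmailer address/port pairs listed in the config examples, assuming
those addresses are still current.  A successful connect only shows that
something is accepting connections on the port; it says nothing about
whether the netmailer protocol itself is healthy.

    import socket

    # cell_netmailer addresses/ports as listed in the config examples above.
    CELL_NETMAILERS = {
        "nstx_pcs": ("198.35.10.21", 22375),
        "pppl_res": ("192.55.106.24", 1000),
    }

    def port_open(host, port, timeout=5.0):
        """Return True if a plain TCP connection to (host, port) succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    for cell, (host, port) in CELL_NETMAILERS.items():
        state = "accepting connections" if port_open(host, port) else "unreachable"
        print("%s cell_netmailer %s:%d -- %s" % (cell, host, port, state))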