Changeset d76bea4 in trunk
- Timestamp: 2020-12-11T15:32:20Z (4 years ago)
- Branches: master
- Children: e59a922
- Parents: 07e4fe84 (diff), d096cc54 (diff)
- git-author: jehadbaeth <60194206+jehadbaeth@…> (2020-12-11 15:32:20)
- git-committer: GitHub <noreply@…> (2020-12-11 15:32:20)

Note: this is a merge changeset; the changes displayed below correspond to the
merge itself. Use the (diff) links above to see all the changes relative to
each parent.
- Files: 5 added, 1 deleted, 34 edited
docs/configuration.rst (r07e4fe84 → rd76bea4)

 ==========

-A node can be a client/server, an introducer, or a statistics gatherer.
+A node can be a client/server or an introducer.

 Client/server nodes provide one or more of the following services:
…
 for uploads. See :doc:`helper` for details.

-``stats_gatherer.furl = (FURL string, optional)``
-
-    If provided, the node will connect to the given stats gatherer and
-    provide it with operational statistics.
-
 ``shares.needed = (int, optional) aka "k", default 3``
…
 This file is used to construct an introducer, and is created by the
 "``tahoe create-introducer``" command.
-
-``tahoe-stats-gatherer.tac``
-
-    This file is used to construct a statistics gatherer, and is created by the
-    "``tahoe create-stats-gatherer``" command.

 ``private/control.furl``
docs/man/man1/tahoe.1 (r07e4fe84 → rd76bea4)

 .B \f[B]create-introducer\f[]
 Create an introducer node.
-.TP
-.B \f[B]create-stats-gatherer\f[]
-Create a stats-gatherer service.
 .SS OPTIONS
 .TP
docs/stats.rst (r07e4fe84 → rd76bea4)

 1. `Overview`_
 2. `Statistics Categories`_
-3. `Running a Tahoe Stats-Gatherer Service`_
-4. `Using Munin To Graph Stats Values`_
+3. `Using Munin To Graph Stats Values`_

 Overview
…
 sometimes be negative due to wraparound of the kernel's counter.

-**stats.load_monitor.\***
-
-    When enabled, the "load monitor" continually schedules a one-second
-    callback, and measures how late the response is. This estimates system load
-    (if the system is idle, the response should be on time). This is only
-    enabled if a stats-gatherer is configured.
-
-    avg_load
-        average "load" value (seconds late) over the last minute
-
-    max_load
-        maximum "load" value over the last minute
-
-
-Running a Tahoe Stats-Gatherer Service
-======================================
-
-The "stats-gatherer" is a simple daemon that periodically collects stats from
-several tahoe nodes. It could be useful, e.g., in a production environment,
-where you want to monitor dozens of storage servers from a central management
-host. It merely gathers statistics from many nodes into a single place: it
-does not do any actual analysis.
-
-The stats gatherer listens on a network port using the same Foolscap_
-connection library that Tahoe clients use to connect to storage servers.
-Tahoe nodes can be configured to connect to the stats gatherer and publish
-their stats on a periodic basis. (In fact, what happens is that nodes connect
-to the gatherer and offer it a second FURL which points back to the node's
-"stats port", which the gatherer then uses to pull stats on a periodic basis.
-The initial connection is flipped to allow the nodes to live behind NAT
-boxes, as long as the stats-gatherer has a reachable IP address.)
-
-.. _Foolscap: https://foolscap.lothar.com/trac
-
-The stats-gatherer is created in the same fashion as regular tahoe client
-nodes and introducer nodes. Choose a base directory for the gatherer to live
-in (but do not create the directory). Choose the hostname that should be
-advertised in the gatherer's FURL. Then run:
-
-::
-
-    tahoe create-stats-gatherer --hostname=HOSTNAME $BASEDIR
-
-and start it with "tahoe start $BASEDIR". Once running, the gatherer will
-write a FURL into $BASEDIR/stats_gatherer.furl .
-
-To configure a Tahoe client/server node to contact the stats gatherer, copy
-this FURL into the node's tahoe.cfg file, in a section named "[client]",
-under a key named "stats_gatherer.furl", like so:
-
-::
-
-    [client]
-    stats_gatherer.furl = pb://qbo4ktl667zmtiuou6lwbjryli2brv6t@HOSTNAME:PORTNUM/wxycb4kaexzskubjnauxeoptympyf45y
-
-or simply copy the stats_gatherer.furl file into the node's base directory
-(next to the tahoe.cfg file): it will be interpreted in the same way.
-
-When the gatherer is created, it will allocate a random unused TCP port, so
-it should not conflict with anything else that you have running on that host
-at that time. To explicitly control which port it uses, run the creation
-command with ``--location=`` and ``--port=`` instead of ``--hostname=``. If
-you use a hostname of ``example.org`` and a port number of ``1234``, then
-run::
-
-    tahoe create-stats-gatherer --location=tcp:example.org:1234 --port=tcp:1234
-
-``--location=`` is a Foolscap FURL hints string (so it can be a
-comma-separated list of connection hints), and ``--port=`` is a Twisted
-"server endpoint specification string", as described in :doc:`configuration`.
-
-Once running, the stats gatherer will create a standard JSON file in
-``$BASEDIR/stats.json``. Once a minute, the gatherer will pull stats
-information from every connected node and write them into the file. The file
-will contain a dictionary, in which node identifiers (known as "tubid"
-strings) are the keys, and the values are a dict with 'timestamp',
-'nickname', and 'stats' keys. d[tubid][stats] will contain the stats
-dictionary as made available at http://localhost:3456/statistics?t=json . The
-file will only contain the most recent update from each node.
-
-Other tools can be built to examine these stats and render them into
-something useful. For example, a tool could sum the
-"storage_server.disk_avail" values from all servers to compute a
-total-disk-available number for the entire grid (however, the "disk watcher"
-daemon, in misc/operations_helpers/spacetime/, is better suited for this
-specific task).

 Using Munin To Graph Stats Values
src/allmydata/client.py (r07e4fe84 → rd76bea4)

 from base64 import urlsafe_b64encode
 from functools import partial
-
 # On Python 2 this will be the backported package:
 from configparser import NoSectionError
…
     "shares.needed",
     "shares.total",
-    "stats_gatherer.furl",
     "storage.plugins",
 ),
…
     def init_stats_provider(self):
-        gatherer_furl = self.config.get_config("client", "stats_gatherer.furl", None)
-        if gatherer_furl:
-            # FURLs should be bytes:
-            gatherer_furl = gatherer_furl.encode("utf-8")
-        self.stats_provider = StatsProvider(self, gatherer_furl)
+        self.stats_provider = StatsProvider(self)
         self.stats_provider.setServiceParent(self)
         self.stats_provider.register_producer(self)
src/allmydata/immutable/checker.py (r07e4fe84 → rd76bea4)

+"""
+Ported to Python 3.
+"""
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+from future.utils import PY2
+if PY2:
+    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
+
 from zope.interface import implementer
 from twisted.internet import defer
src/allmydata/immutable/repairer.py (r07e4fe84 → rd76bea4)

+"""
+Ported to Python 3.
+"""
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+from future.utils import PY2
+if PY2:
+    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
+
 from zope.interface import implementer
 from twisted.internet import defer
src/allmydata/interfaces.py (r07e4fe84 → rd76bea4)

-class RIStatsProvider(RemoteInterface):
-    __remote_name__ = native_str("RIStatsProvider.tahoe.allmydata.com")
-    """
-    Provides access to statistics and monitoring information.
-    """
-
-    def get_stats():
-        """
-        returns a dictionary containing 'counters' and 'stats', each a
-        dictionary with string counter/stat name keys, and numeric or None values.
-        counters are monotonically increasing measures of work done, and
-        stats are instantaneous measures (potentially time averaged
-        internally)
-        """
-        return DictOf(bytes, DictOf(bytes, ChoiceOf(float, int, long, None)))
-
-
-class RIStatsGatherer(RemoteInterface):
-    __remote_name__ = native_str("RIStatsGatherer.tahoe.allmydata.com")
-    """
-    Provides a monitoring service for centralised collection of stats
-    """
-
-    def provide(provider=RIStatsProvider, nickname=bytes):
-        """
-        @param provider: a stats collector instance that should be polled
-        periodically by the gatherer to collect stats.
-        @param nickname: a name useful to identify the provided client
-        """
-        return None
-
-
 class IStatsProducer(Interface):
     def get_stats():
src/allmydata/introducer/client.py (r07e4fe84 → rd76bea4)

-from past.builtins import unicode, long
+"""
+Ported to Python 3.
+"""
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+from future.utils import PY2
+if PY2:
+    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
+from past.builtins import long
+
 from six import ensure_text
…
              sequencer, cache_filepath):
         self._tub = tub
-        if isinstance(introducer_furl, unicode):
+        if isinstance(introducer_furl, str):
             introducer_furl = introducer_furl.encode("utf-8")
         self.introducer_furl = introducer_furl

-        assert type(nickname) is unicode
+        assert isinstance(nickname, str)
         self._nickname = nickname
         self._my_version = my_version
…
     def _save_announcements(self):
         announcements = []
-        for _, value in self._inbound_announcements.items():
+        for value in self._inbound_announcements.values():
             ann, key_s, time_stamp = value
             # On Python 2, bytes strings are encoded into YAML Unicode strings.
…
             announcements.append(server_params)
         announcement_cache_yaml = yamlutil.safe_dump(announcements)
-        if isinstance(announcement_cache_yaml, unicode):
+        if isinstance(announcement_cache_yaml, str):
             announcement_cache_yaml = announcement_cache_yaml.encode("utf-8")
         self._cache_filepath.setContent(announcement_cache_yaml)
…
         self._subscribed_service_names.add(service_name)
         self._maybe_subscribe()
-        for index,(ann,key_s,when) in self._inbound_announcements.items():
+        for index,(ann,key_s,when) in list(self._inbound_announcements.items()):
             precondition(isinstance(key_s, bytes), key_s)
             servicename = index[0]
…
         # publish all announcements with the new seqnum and nonce
-        for service_name,ann_d in self._outbound_announcements.items():
+        for service_name,ann_d in list(self._outbound_announcements.items()):
             ann_d["seqnum"] = current_seqnum
             ann_d["nonce"] = current_nonce
…
             return
         # this re-publishes everything. The Introducer ignores duplicates
-        for ann_t in self._published_announcements.values():
+        for ann_t in list(self._published_announcements.values()):
             self._debug_counts["outbound_message"] += 1
             self._debug_outstanding += 1
…
         # for ASCII values, simplejson might give us unicode *or* bytes
         if "nickname" in ann and isinstance(ann["nickname"], bytes):
-            ann["nickname"] = unicode(ann["nickname"])
+            ann["nickname"] = str(ann["nickname"])
         nick_s = ann.get("nickname",u"").encode("utf-8")
         lp2 = self.log(format="announcement for nickname '%(nick)s', service=%(svc)s: %(ann)s",
src/allmydata/introducer/common.py (r07e4fe84 → rd76bea4)

-from past.builtins import unicode
+"""
+Ported to Python 3.
+"""
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+from future.utils import PY2
+if PY2:
+    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

 import re
…
 def get_tubid_string_from_ann(ann):
     furl = ann.get("anonymous-storage-FURL") or ann.get("FURL")
-    if isinstance(furl, unicode):
-        furl = furl.encode("utf-8")
     return get_tubid_string(furl)

 def get_tubid_string(furl):
-    m = re.match(br'pb://(\w+)@', furl)
+    m = re.match(r'pb://(\w+)@', furl)
     assert m
-    return m.group(1).lower()
+    return m.group(1).lower().encode("ascii")
src/allmydata/introducer/server.py (r07e4fe84 → rd76bea4)

+"""
+Ported to Python 3.
+"""
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+from future.utils import PY2
+if PY2:
+    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
 from past.builtins import long
-from six import ensure_str, ensure_text
+from six import ensure_text

 import time, os.path, textwrap
…
         # expected keys are: version, nickname, app-versions, my-version,
         # oldest-supported
-        self._subscribers = {}
+        self._subscribers = dictutil.UnicodeKeyDict({})

         self._debug_counts = {"inbound_message": 0,
…
         """Return a list of AnnouncementDescriptor for all announcements"""
         announcements = []
-        for (index, (_, canary, ann, when)) in self._announcements.items():
+        for (index, (_, canary, ann, when)) in list(self._announcements.items()):
             ad = AnnouncementDescriptor(when, index, canary, ann)
             announcements.append(ad)
…
         """Return a list of SubscriberDescriptor objects for all subscribers"""
         s = []
-        for service_name, subscriptions in self._subscribers.items():
-            for rref,(subscriber_info,when) in subscriptions.items():
+        for service_name, subscriptions in list(self._subscribers.items()):
+            for rref,(subscriber_info,when) in list(subscriptions.items()):
                 # note that if the subscriber didn't do Tub.setLocation,
                 # tubid will be None. Also, subscribers do not tell us which
…
         self.log("introducer: subscription[%s] request at %s"
                  % (service_name, subscriber), umid="U3uzLg")
-        service_name = ensure_str(service_name)
+        service_name = ensure_text(service_name)
         subscriber_info = dictutil.UnicodeKeyDict({
             ensure_text(k): v for (k, v) in subscriber_info.items()
…
         subscriber.notifyOnDisconnect(_remove)

+        # Make sure types are correct:
+        for k in self._announcements:
+            assert isinstance(k[0], type(service_name))
+
         # now tell them about any announcements they're interested in
-        assert {type(service_name)}.issuperset(
-            set(type(k[0]) for k in self._announcements)), (
-                service_name, self._announcements.keys()
-        )
         announcements = set( [ ann_t
                                for idx,(ann_t,canary,ann,when)
src/allmydata/scripts/create_node.py (r07e4fe84 → rd76bea4)

         c.write("[client]\n")
         c.write("helper.furl =\n")
-        c.write("#stats_gatherer.furl =\n")
         c.write("\n")
         c.write("# Encoding parameters this client will use for newly-uploaded files\n")
src/allmydata/scripts/run_common.py (r07e4fe84 → rd76bea4)

 from allmydata.scripts.default_nodedir import _default_nodedir
 from allmydata.util import fileutil
-from allmydata.node import read_config
 from allmydata.util.encodingutil import listdir_unicode, quote_local_unicode_path
 from allmydata.util.configutil import UnknownConfigError
…
 def identify_node_type(basedir):
     """
-    :return unicode: None or one of: 'client', 'introducer',
-        'key-generator' or 'stats-gatherer'
+    :return unicode: None or one of: 'client', 'introducer', or
+        'key-generator'
     """
     tac = u''
…
         return None

-    for t in (u"client", u"introducer", u"key-generator", u"stats-gatherer"):
+    for t in (u"client", u"introducer", u"key-generator"):
         if t in tac:
             return t
…
             u"client": lambda: maybeDeferred(namedAny("allmydata.client.create_client"), self.basedir),
             u"introducer": lambda: maybeDeferred(namedAny("allmydata.introducer.server.create_introducer"), self.basedir),
-            u"stats-gatherer": lambda: maybeDeferred(namedAny("allmydata.stats.StatsGathererService"), read_config(self.basedir, None), self.basedir, verbose=True),
             u"key-generator": key_generator_removed,
         }
src/allmydata/scripts/runner.py (r07e4fe84 → rd76bea4)

 from allmydata.scripts.common import get_default_nodedir
 from allmydata.scripts import debug, create_node, cli, \
-    stats_gatherer, admin, tahoe_daemonize, tahoe_start, \
+    admin, tahoe_daemonize, tahoe_start, \
     tahoe_stop, tahoe_restart, tahoe_run, tahoe_invite
 from allmydata.util.encodingutil import quote_output, quote_local_unicode_path, get_io_encoding
…
 subCommands = ( create_node.subCommands
-                + stats_gatherer.subCommands
                 + admin.subCommands
                 + process_control_commands
…
 create_dispatch = {}
-for module in (create_node, stats_gatherer):
+for module in (create_node,):
     create_dispatch.update(module.dispatch)
src/allmydata/scripts/tahoe_add_alias.py (r07e4fe84 → rd76bea4)

 from __future__ import print_function
+from __future__ import unicode_literals

 import os.path
…
 from allmydata.scripts.common import get_aliases
 from allmydata.util.fileutil import move_into_place
-from allmydata.util.encodingutil import unicode_to_output, quote_output
+from allmydata.util.encodingutil import quote_output, quote_output_u
…
     old_aliases = get_aliases(nodedir)
     if alias in old_aliases:
-        print("Alias %s already exists!" % quote_output(alias), file=stderr)
+        show_output(stderr, "Alias {alias} already exists!", alias=alias)
         return 1
     aliasfile = os.path.join(nodedir, "private", "aliases")
…
     add_line_to_aliasfile(aliasfile, alias, cap)
-
-    print("Alias %s added" % quote_output(alias), file=stdout)
+    show_output(stdout, "Alias {alias} added", alias=alias)
     return 0
…
     old_aliases = get_aliases(nodedir)
     if alias in old_aliases:
-        print("Alias %s already exists!" % quote_output(alias), file=stderr)
+        show_output(stderr, "Alias {alias} already exists!", alias=alias)
         return 1
…
     add_line_to_aliasfile(aliasfile, alias, new_uri)
+    show_output(stdout, "Alias {alias} created", alias=alias)
+    return 0

-    print("Alias %s created" % (quote_output(alias),), file=stdout)
-    return 0
+
+def show_output(fp, template, **kwargs):
+    """
+    Print to just about anything.
+
+    :param fp: A file-like object to which to print. This handles the case
+        where ``fp`` declares a support encoding with the ``encoding``
+        attribute (eg sys.stdout on Python 3). It handles the case where
+        ``fp`` declares no supported encoding via ``None`` for its
+        ``encoding`` attribute (eg sys.stdout on Python 2 when stdout is not a
+        tty). It handles the case where ``fp`` declares an encoding that does
+        not support all of the characters in the output by forcing the
+        "namereplace" error handler. It handles the case where there is no
+        ``encoding`` attribute at all (eg StringIO.StringIO) by writing
+        utf-8-encoded bytes.
+    """
+    assert isinstance(template, unicode)
+
+    # On Python 3 fp has an encoding attribute under all real usage. On
+    # Python 2, the encoding attribute is None if stdio is not a tty. The
+    # test suite often passes StringIO which has no such attribute. Make
+    # allowances for this until the test suite is fixed and Python 2 is no
+    # more.
+    try:
+        encoding = fp.encoding or "utf-8"
+    except AttributeError:
+        has_encoding = False
+        encoding = "utf-8"
+    else:
+        has_encoding = True
+
+    output = template.format(**{
+        k: quote_output_u(v, encoding=encoding)
+        for (k, v)
+        in kwargs.items()
+    })
+    safe_output = output.encode(encoding, "namereplace")
+    if has_encoding:
+        safe_output = safe_output.decode(encoding)
+    print(safe_output, file=fp)
…

+def _escape_format(t):
+    """
+    _escape_format(t).format() == t
+
+    :param unicode t: The text to escape.
+    """
+    return t.replace("{", "{{").replace("}", "}}")
+
+
 def list_aliases(options):
-    nodedir = options['node-directory']
-    stdout = options.stdout
-    stderr = options.stderr
-
-    data = _get_alias_details(nodedir)
-
-    max_width = max([len(quote_output(name)) for name in data.keys()] + [0])
-    fmt = "%" + str(max_width) + "s: %s"
-    rc = 0
+    """
+    Show aliases that exist.
+    """
+    data = _get_alias_details(options['node-directory'])

     if options['json']:
-        try:
-            # XXX why are we presuming utf-8 output?
-            print(json.dumps(data, indent=4).decode('utf-8'), file=stdout)
-        except (UnicodeEncodeError, UnicodeDecodeError):
-            print(json.dumps(data, indent=4), file=stderr)
-            rc = 1
+        output = _escape_format(json.dumps(data, indent=4).decode("ascii"))
     else:
-        for name, details in data.items():
-            dircap = details['readonly'] if options['readonly-uri'] else details['readwrite']
-            try:
-                print(fmt % (unicode_to_output(name), unicode_to_output(dircap.decode('utf-8'))), file=stdout)
-            except (UnicodeEncodeError, UnicodeDecodeError):
-                print(fmt % (quote_output(name), quote_output(dircap)), file=stderr)
-                rc = 1
+        def dircap(details):
+            return (
+                details['readonly']
+                if options['readonly-uri']
+                else details['readwrite']
+            ).decode("utf-8")

-    if rc == 1:
-        print("\nThis listing included aliases or caps that could not be converted to the terminal" \
-              "\noutput encoding. These are shown using backslash escapes and in quotes.", file=stderr)
-    return rc
+        def format_dircap(name, details):
+            return fmt % (name, dircap(details))
+
+        max_width = max([len(quote_output(name)) for name in data.keys()] + [0])
+        fmt = "%" + str(max_width) + "s: %s"
+        output = "\n".join(list(
+            format_dircap(name, details)
+            for name, details
+            in data.items()
+        ))
+
+    if output:
+        # Show whatever we computed. Skip this if there is no output to avoid
+        # a spurious blank line.
+        show_output(options.stdout, output)
+
+    return 0
src/allmydata/stats.py (r07e4fe84 → rd76bea4)

 from __future__ import print_function

-import json
-import os
-import pprint
 import time
-from collections import deque

 # Python 2 compatibility
…
 from future.builtins import str  # noqa: F401

-from twisted.internet import reactor
 from twisted.application import service
 from twisted.application.internet import TimerService
 from zope.interface import implementer
-from foolscap.api import eventually, DeadReferenceError, Referenceable, Tub
+from foolscap.api import eventually

 from allmydata.util import log
-from allmydata.util.encodingutil import quote_local_unicode_path
-from allmydata.interfaces import RIStatsProvider, RIStatsGatherer, IStatsProducer
-
-@implementer(IStatsProducer)
-class LoadMonitor(service.MultiService):
-
-    loop_interval = 1
-    num_samples = 60
-
-    def __init__(self, provider, warn_if_delay_exceeds=1):
-        service.MultiService.__init__(self)
-        self.provider = provider
-        self.warn_if_delay_exceeds = warn_if_delay_exceeds
-        self.started = False
-        self.last = None
-        self.stats = deque()
-        self.timer = None
-
-    def startService(self):
-        if not self.started:
-            self.started = True
-            self.timer = reactor.callLater(self.loop_interval, self.loop)
-        service.MultiService.startService(self)
-
-    def stopService(self):
-        self.started = False
-        if self.timer:
-            self.timer.cancel()
-            self.timer = None
-        return service.MultiService.stopService(self)
-
-    def loop(self):
-        self.timer = None
-        if not self.started:
-            return
-        now = time.time()
-        if self.last is not None:
-            delay = now - self.last - self.loop_interval
-            if delay > self.warn_if_delay_exceeds:
-                log.msg(format='excessive reactor delay (%ss)', args=(delay,),
-                        level=log.UNUSUAL)
-            self.stats.append(delay)
-            while len(self.stats) > self.num_samples:
-                self.stats.popleft()
-
-        self.last = now
-        self.timer = reactor.callLater(self.loop_interval, self.loop)
-
-    def get_stats(self):
-        if self.stats:
-            avg = sum(self.stats) / len(self.stats)
-            m_x = max(self.stats)
-        else:
-            avg = m_x = 0
-        return { 'load_monitor.avg_load': avg,
-                 'load_monitor.max_load': m_x, }
+from allmydata.interfaces import IStatsProducer

 @implementer(IStatsProducer)
…

-@implementer(RIStatsProvider)
-class StatsProvider(Referenceable, service.MultiService):
+class StatsProvider(service.MultiService):

-    def __init__(self, node, gatherer_furl):
+    def __init__(self, node):
         service.MultiService.__init__(self)
         self.node = node
-        self.gatherer_furl = gatherer_furl # might be None

         self.counters = {}
         self.stats_producers = []
-
-        # only run the LoadMonitor (which submits a timer every second) if
-        # there is a gatherer who is going to be paying attention. Our stats
-        # are visible through HTTP even without a gatherer, so run the rest
-        # of the stats (including the once-per-minute CPUUsageMonitor)
-        if gatherer_furl:
-            self.load_monitor = LoadMonitor(self)
-            self.load_monitor.setServiceParent(self)
-            self.register_producer(self.load_monitor)
-
         self.cpu_monitor = CPUUsageMonitor()
         self.cpu_monitor.setServiceParent(self)
         self.register_producer(self.cpu_monitor)
-
-    def startService(self):
-        if self.node and self.gatherer_furl:
-            nickname_utf8 = self.node.nickname.encode("utf-8")
-            self.node.tub.connectTo(self.gatherer_furl,
-                                    self._connected, nickname_utf8)
-        service.MultiService.startService(self)

     def count(self, name, delta=1):
…
         log.msg(format='get_stats() -> %(stats)s', stats=ret, level=log.NOISY)
         return ret
-
-    def remote_get_stats(self):
-        # The remote API expects keys to be bytes:
-        def to_bytes(d):
-            result = {}
-            for (k, v) in d.items():
-                if isinstance(k, str):
-                    k = k.encode("utf-8")
-                result[k] = v
-            return result
-
-        stats = self.get_stats()
-        return {b"counters": to_bytes(stats["counters"]),
-                b"stats": to_bytes(stats["stats"])}
-
-    def _connected(self, gatherer, nickname):
-        gatherer.callRemoteOnly('provide', self, nickname or '')
-
-
-@implementer(RIStatsGatherer)
-class StatsGatherer(Referenceable, service.MultiService):
-
-    poll_interval = 60
-
-    def __init__(self, basedir):
-        service.MultiService.__init__(self)
-        self.basedir = basedir
-
-        self.clients = {}
-        self.nicknames = {}
-
-        self.timer = TimerService(self.poll_interval, self.poll)
-        self.timer.setServiceParent(self)
-
-    def get_tubid(self, rref):
-        return rref.getRemoteTubID()
-
-    def remote_provide(self, provider, nickname):
-        tubid = self.get_tubid(provider)
-        if tubid == '<unauth>':
-            print("WARNING: failed to get tubid for %s (%s)" % (provider, nickname))
-            # don't add to clients to poll (polluting data) don't care about disconnect
-            return
-        self.clients[tubid] = provider
-        self.nicknames[tubid] = nickname
-
-    def poll(self):
-        for tubid,client in self.clients.items():
-            nickname = self.nicknames.get(tubid)
-            d = client.callRemote('get_stats')
-            d.addCallbacks(self.got_stats, self.lost_client,
-                           callbackArgs=(tubid, nickname),
-                           errbackArgs=(tubid,))
-            d.addErrback(self.log_client_error, tubid)
-
-    def lost_client(self, f, tubid):
-        # this is called lazily, when a get_stats request fails
-        del self.clients[tubid]
-        del self.nicknames[tubid]
-        f.trap(DeadReferenceError)
-
-    def log_client_error(self, f, tubid):
-        log.msg("StatsGatherer: error in get_stats(), peerid=%s" % tubid,
-                level=log.UNUSUAL, failure=f)
-
-    def got_stats(self, stats, tubid, nickname):
-        raise NotImplementedError()
-
-
-class StdOutStatsGatherer(StatsGatherer):
-    verbose = True
-
-    def remote_provide(self, provider, nickname):
-        tubid = self.get_tubid(provider)
-        if self.verbose:
-            print('connect "%s" [%s]' % (nickname, tubid))
-            provider.notifyOnDisconnect(self.announce_lost_client, tubid)
-        StatsGatherer.remote_provide(self, provider, nickname)
-
-    def announce_lost_client(self, tubid):
-        print('disconnect "%s" [%s]' % (self.nicknames[tubid], tubid))
-
-    def got_stats(self, stats, tubid, nickname):
-        print('"%s" [%s]:' % (nickname, tubid))
-        pprint.pprint(stats)
-
-
-class JSONStatsGatherer(StdOutStatsGatherer):
-    # inherit from StdOutStatsGatherer for connect/disconnect notifications
-
-    def __init__(self, basedir=u".", verbose=True):
-        self.verbose = verbose
-        StatsGatherer.__init__(self, basedir)
-        self.jsonfile = os.path.join(basedir, "stats.json")
-
-        if os.path.exists(self.jsonfile):
-            try:
-                with open(self.jsonfile, 'rb') as f:
-                    self.gathered_stats = json.load(f)
-            except Exception:
-                print("Error while attempting to load stats file %s.\n"
-                      "You may need to restore this file from a backup,"
-                      " or delete it if no backup is available.\n" %
-                      quote_local_unicode_path(self.jsonfile))
-                raise
-        else:
-            self.gathered_stats = {}
-
-    def got_stats(self, stats, tubid, nickname):
-        s = self.gathered_stats.setdefault(tubid, {})
-        s['timestamp'] = time.time()
-        s['nickname'] = nickname
-        s['stats'] = stats
-        self.dump_json()
-
-    def dump_json(self):
-        tmp = "%s.tmp" % (self.jsonfile,)
-        with open(tmp, 'wb') as f:
-            json.dump(self.gathered_stats, f)
-        if os.path.exists(self.jsonfile):
-            os.unlink(self.jsonfile)
-        os.rename(tmp, self.jsonfile)
-
-
-class StatsGathererService(service.MultiService):
-    furl_file = "stats_gatherer.furl"
-
-    def __init__(self, basedir=".", verbose=False):
-        service.MultiService.__init__(self)
-        self.basedir = basedir
-        self.tub = Tub(certFile=os.path.join(self.basedir,
-                                             "stats_gatherer.pem"))
-        self.tub.setServiceParent(self)
-        self.tub.setOption("logLocalFailures", True)
-        self.tub.setOption("logRemoteFailures", True)
-        self.tub.setOption("expose-remote-exception-types", False)
-
-        self.stats_gatherer = JSONStatsGatherer(self.basedir, verbose)
-        self.stats_gatherer.setServiceParent(self)
-
-        try:
-            with open(os.path.join(self.basedir, "location")) as f:
-                location = f.read().strip()
-        except EnvironmentError:
-            raise ValueError("Unable to find 'location' in BASEDIR, please rebuild your stats-gatherer")
-        try:
-            with open(os.path.join(self.basedir, "port")) as f:
-                port = f.read().strip()
-        except EnvironmentError:
-            raise ValueError("Unable to find 'port' in BASEDIR, please rebuild your stats-gatherer")
-
-        self.tub.listenOn(port)
-        self.tub.setLocation(location)
-        ff = os.path.join(self.basedir, self.furl_file)
-        self.gatherer_furl = self.tub.registerReference(self.stats_gatherer,
-                                                        furlFile=ff)
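The deleted ``JSONStatsGatherer.dump_json`` wrote to a temporary file and then renamed it, so a reader never observed a half-written ``stats.json``. A standalone sketch of the same pattern, adapted for Python 3 text-mode files (``dump_json`` and the paths here are illustrative, not part of the remaining codebase):

```python
import json
import os
import tempfile

def dump_json(jsonfile, gathered_stats):
    # Write to a sibling temp file, then rename into place, mirroring the
    # removed JSONStatsGatherer.dump_json.
    tmp = "%s.tmp" % (jsonfile,)
    with open(tmp, "w") as f:
        json.dump(gathered_stats, f)
    if os.path.exists(jsonfile):
        os.unlink(jsonfile)
    os.rename(tmp, jsonfile)

d = tempfile.mkdtemp()
path = os.path.join(d, "stats.json")
dump_json(path, {"tubid": {"nickname": "storage-1"}})
with open(path) as f:
    print(json.load(f)["tubid"]["nickname"])  # prints storage-1
```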
src/allmydata/storage_client.py (r07e4fe84 → rd76bea4)

@@ -562 +562 @@
 
     *nickname* is optional.
-    """
+
+    The furl will be a Unicode string on Python 3; on Python 2 it will be
+    either a native (bytes) string or a Unicode string.
+    """
+    furl = furl.encode("utf-8")
     m = re.match(br'pb://(\w+)@', furl)
     assert m, furl
@@ -757 +761 @@
         return _FoolscapStorage.from_announcement(
             self._server_id,
-            furl.encode("utf-8"),
+            furl,
             ann,
             storage_server,
@@ -769 +773 @@
             pass
         else:
-            if isinstance(furl, str):
-                furl = furl.encode("utf-8")
             # See comment above for the _storage_from_foolscap_plugin case
             # about passing in get_rref.
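The hunk above moves the utf-8 encoding of the furl into the parsing code itself, so the bytes regex always sees bytes regardless of what the caller passed. A small sketch of that parse (the helper name is hypothetical; the regex is the one from the diff):

```python
import re

def tubid_from_furl(furl):
    # Accept either text or bytes; normalize to bytes first, as the
    # patched code now does before matching.
    if isinstance(furl, str):
        furl = furl.encode("utf-8")
    m = re.match(br'pb://(\w+)@', furl)
    assert m, furl
    return m.group(1)  # the base32 tub id portion of the furl

if __name__ == "__main__":
    print(tubid_from_furl(u"pb://abcde@nowhere/fake"))
```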
src/allmydata/test/cli/common.py (r07e4fe84 → rd76bea4)

@@ -1 +1 @@
 from ...util.encodingutil import unicode_to_argv
 from ...scripts import runner
-from ..common_util import ReallyEqualMixin, run_cli
+from ..common_util import ReallyEqualMixin, run_cli, run_cli_unicode
 
 def parse_options(basedir, command, args):
@@ -11 +11 @@
 
 
 class CLITestMixin(ReallyEqualMixin):
-    def do_cli(self, verb, *args, **kwargs):
+    """
+    A mixin for use with ``GridTestMixin`` to execute CLI commands against
+    nodes created by methods of that mixin.
+    """
+    def do_cli_unicode(self, verb, argv, client_num=0, **kwargs):
+        """
+        Run a Tahoe-LAFS CLI command.
+
+        :param verb: See ``run_cli_unicode``.
+
+        :param argv: See ``run_cli_unicode``.
+
+        :param int client_num: The number of the ``GridTestMixin``-created
+            node against which to execute the command.
+
+        :param kwargs: Additional keyword arguments to pass to
+            ``run_cli_unicode``.
+        """
         # client_num is used to execute client CLI commands on a specific
         # client.
-        client_num = kwargs.get("client_num", 0)
+        client_dir = self.get_clientdir(i=client_num)
+        nodeargs = [ u"--node-directory", client_dir ]
+        return run_cli_unicode(verb, argv, nodeargs=nodeargs, **kwargs)
+
+
+    def do_cli(self, verb, *args, **kwargs):
+        """
+        Like ``do_cli_unicode`` but work with ``bytes`` everywhere instead of
+        ``unicode``.
+
+        Where possible, prefer ``do_cli_unicode``.
+        """
+        # client_num is used to execute client CLI commands on a specific
+        # client.
+        client_num = kwargs.pop("client_num", 0)
         client_dir = unicode_to_argv(self.get_clientdir(i=client_num))
-        nodeargs = [ "--node-directory", client_dir ]
-        return run_cli(verb, nodeargs=nodeargs, *args, **kwargs)
+        nodeargs = [ b"--node-directory", client_dir ]
+        return run_cli(verb, *args, nodeargs=nodeargs, **kwargs)
src/allmydata/test/cli/test_alias.py (r07e4fe84 → rd76bea4)

@@ -1 +1 @@
 import json
-from mock import patch
 
 from twisted.trial import unittest
 from twisted.internet.defer import inlineCallbacks
 
-from allmydata.util.encodingutil import unicode_to_argv
 from allmydata.scripts.common import get_aliases
 from allmydata.test.no_network import GridTestMixin
 from .common import CLITestMixin
-from ..common_util import skip_if_cannot_represent_argv
+from allmydata.util import encodingutil
 
 # see also test_create_alias
 
@@ -16 +14 @@
     @inlineCallbacks
-    def test_list(self):
-        self.basedir = "cli/ListAlias/test_list"
+    def _check_create_alias(self, alias, encoding):
+        """
+        Verify that ``tahoe create-alias`` can be used to create an alias named
+        ``alias`` when argv is encoded using ``encoding``.
+
+        :param unicode alias: The alias to try to create.
+
+        :param NoneType|str encoding: The name of an encoding to force the
+            ``create-alias`` implementation to use.  This simulates the
+            effects of setting LANG and doing other locale-foolishness without
+            actually having to mess with this process's global locale state.
+            If this is ``None`` then the encoding used will be ascii but the
+            stdio objects given to the code under test will not declare any
+            encoding (this is like Python 2 when stdio is not a tty).
+
+        :return Deferred: A Deferred that fires with success if the alias can
+            be created and that creation is reported on stdout appropriately
+            encoded or with failure if something goes wrong.
+        """
+        self.basedir = self.mktemp()
         self.set_up_grid(oneshare=True)
 
-        rc, stdout, stderr = yield self.do_cli(
-            "create-alias",
-            unicode_to_argv(u"tahoe"),
+        # We can pass an encoding into the test utilities to invoke the code
+        # under test but we can't pass such a parameter directly to the code
+        # under test.  Instead, that code looks at io_encoding.  So,
+        # monkey-patch that value to our desired value here.  This is the code
+        # that most directly takes the place of messing with LANG or the
+        # locale module.
+        self.patch(encodingutil, "io_encoding", encoding or "ascii")
+
+        rc, stdout, stderr = yield self.do_cli_unicode(
+            u"create-alias",
+            [alias],
+            encoding=encoding,
         )
 
-        self.failUnless(unicode_to_argv(u"Alias 'tahoe' created") in stdout)
-        self.failIf(stderr)
+        # Make sure the result of the create-alias command is as we want it to
+        # be.
+        self.assertEqual(u"Alias '{}' created\n".format(alias), stdout)
+        self.assertEqual("", stderr)
+        self.assertEqual(0, rc)
+
+        # Make sure it had the intended side-effect, too - an alias created in
+        # the node filesystem state.
         aliases = get_aliases(self.get_clientdir())
-        self.failUnless(u"tahoe" in aliases)
-        self.failUnless(aliases[u"tahoe"].startswith("URI:DIR2:"))
+        self.assertIn(alias, aliases)
+        self.assertTrue(aliases[alias].startswith(u"URI:DIR2:"))
 
-        rc, stdout, stderr = yield self.do_cli("list-aliases", "--json")
+        # And inspect the state via the user interface list-aliases command
+        # too.
+        rc, stdout, stderr = yield self.do_cli_unicode(
+            u"list-aliases",
+            [u"--json"],
+            encoding=encoding,
+        )
 
         self.assertEqual(0, rc)
         data = json.loads(stdout)
-        self.assertIn(u"tahoe", data)
-        data = data[u"tahoe"]
-        self.assertIn("readwrite", data)
-        self.assertIn("readonly", data)
+        self.assertIn(alias, data)
+        data = data[alias]
+        self.assertIn(u"readwrite", data)
+        self.assertIn(u"readonly", data)
 
-    @inlineCallbacks
-    def test_list_unicode_mismatch_json(self):
+
+    def test_list_none(self):
         """
-        pretty hack-y test, but we want to cover the 'except' on Unicode
-        errors paths and I can't come up with a nicer way to trigger
-        this
+        An alias composed of all ASCII-encodeable code points can be created when
+        stdio aren't clearly marked with an encoding.
         """
-        self.basedir = "cli/ListAlias/test_list_unicode_mismatch_json"
-        skip_if_cannot_represent_argv(u"tahoe\u263A")
-        self.set_up_grid(oneshare=True)
-
-        rc, stdout, stderr = yield self.do_cli(
-            "create-alias",
-            unicode_to_argv(u"tahoe\u263A"),
+        return self._check_create_alias(
+            u"tahoe",
+            encoding=None,
         )
 
-        self.failUnless(unicode_to_argv(u"Alias 'tahoe\u263A' created") in stdout)
-        self.failIf(stderr)
 
-        booms = []
-
-        def boom(out, indent=4):
-            if not len(booms):
-                booms.append(out)
-                raise UnicodeEncodeError("foo", u"foo", 3, 5, "foo")
-            return str(out)
-
-        with patch("allmydata.scripts.tahoe_add_alias.json.dumps", boom):
-            aliases = get_aliases(self.get_clientdir())
-            self.failUnless(u"tahoe\u263A" in aliases)
-            self.failUnless(aliases[u"tahoe\u263A"].startswith("URI:DIR2:"))
-
-            rc, stdout, stderr = yield self.do_cli("list-aliases", "--json")
-
-            self.assertEqual(1, rc)
-            self.assertIn("could not be converted", stderr)
-
-    @inlineCallbacks
-    def test_list_unicode_mismatch(self):
-        self.basedir = "cli/ListAlias/test_list_unicode_mismatch"
-        skip_if_cannot_represent_argv(u"tahoe\u263A")
-        self.set_up_grid(oneshare=True)
-
-        rc, stdout, stderr = yield self.do_cli(
-            "create-alias",
-            unicode_to_argv(u"tahoe\u263A"),
+    def test_list_ascii(self):
+        """
+        An alias composed of all ASCII-encodeable code points can be created when
+        the active encoding is ASCII.
+        """
+        return self._check_create_alias(
+            u"tahoe",
+            encoding="ascii",
         )
 
-        def boom(out):
-            print("boom {}".format(out))
-            return out
-            raise UnicodeEncodeError("foo", u"foo", 3, 5, "foo")
 
-        with patch("allmydata.scripts.tahoe_add_alias.unicode_to_output", boom):
-            self.failUnless(unicode_to_argv(u"Alias 'tahoe\u263A' created") in stdout)
-            self.failIf(stderr)
-            aliases = get_aliases(self.get_clientdir())
-            self.failUnless(u"tahoe\u263A" in aliases)
-            self.failUnless(aliases[u"tahoe\u263A"].startswith("URI:DIR2:"))
+    def test_list_latin_1(self):
+        """
+        An alias composed of all Latin-1-encodeable code points can be created
+        when the active encoding is Latin-1.
 
-        rc, stdout, stderr = yield self.do_cli("list-aliases")
+        This is very similar to ``test_list_utf_8`` but the assumption of
+        UTF-8 is nearly ubiquitous and explicitly exercising the codepaths
+        with a UTF-8-incompatible encoding helps flush out unintentional UTF-8
+        assumptions.
+        """
+        return self._check_create_alias(
+            u"taho\N{LATIN SMALL LETTER E WITH ACUTE}",
+            encoding="latin-1",
+        )
 
-        self.assertEqual(1, rc)
-        self.assertIn("could not be converted", stderr)
+
+    def test_list_utf_8(self):
+        """
+        An alias composed of all UTF-8-encodeable code points can be created when
+        the active encoding is UTF-8.
+        """
+        return self._check_create_alias(
+            u"tahoe\N{SNOWMAN}",
+            encoding="utf-8",
+        )
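The new tests drive the same helper under ascii, latin-1, and utf-8. As a quick illustration (not part of the test suite) of why latin-1 is a useful case: it is UTF-8-incompatible for non-ASCII code points, so the same alias produces different bytes under each codec, and some aliases have no latin-1 encoding at all:

```python
alias = u"taho\N{LATIN SMALL LETTER E WITH ACUTE}"

# Same code point, different byte sequences depending on the active codec.
print(alias.encode("latin-1"))  # one byte for the accented e
print(alias.encode("utf-8"))    # two bytes for the accented e

# The snowman alias used by test_list_utf_8 cannot be encoded as latin-1.
try:
    u"tahoe\N{SNOWMAN}".encode("latin-1")
except UnicodeEncodeError:
    print("snowman is not latin-1-encodeable")
```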
src/allmydata/test/cli/test_cp.py (r07e4fe84 → rd76bea4)

@@ -662 +662 @@
         # a local directory without a specified file name.
         # https://tahoe-lafs.org/trac/tahoe-lafs/ticket/2027
-        self.basedir = "cli/Cp/cp_verbose"
+        self.basedir = "cli/Cp/ticket_2027"
         self.set_up_grid(oneshare=True)
src/allmydata/test/common.py (r07e4fe84 → rd76bea4)

@@ -11 +11 @@
     "skipIf",
 ]
+
+from past.builtins import chr as byteschr
 
 import os, random, struct
@@ -215 +217 @@
     :ivar FilePath basedir: The base directory of the node.
 
-    :ivar bytes introducer_furl: The introducer furl with which to
+    :ivar str introducer_furl: The introducer furl with which to
         configure the client.
@@ -226 +228 @@
     storage_plugin = attr.ib()
     basedir = attr.ib(validator=attr.validators.instance_of(FilePath))
-    introducer_furl = attr.ib(validator=attr.validators.instance_of(bytes))
+    introducer_furl = attr.ib(validator=attr.validators.instance_of(str),
+                              converter=six.ensure_str)
     node_config = attr.ib(default=attr.Factory(dict))
@@ -1057 +1060 @@
     offset = 0x0c+0x44+sharedatasize-1
 
-    newdata = data[:offset] + chr(ord(data[offset])^0xFF) + data[offset+1:]
+    newdata = data[:offset] + byteschr(ord(data[offset:offset+1])^0xFF) + data[offset+1:]
     if debug:
         log.msg("testing: flipping all bits of byte at offset %d: %r, newdata: %r" % (offset, data[offset], newdata[offset]))
@@ -1085 +1088 @@
     if debug:
         log.msg("original data: %r" % (data,))
-    return data[:0x0c+0x221] + chr(ord(data[0x0c+0x221])^0x02) + data[0x0c+0x2210+1:]
+    return data[:0x0c+0x221] + byteschr(ord(data[0x0c+0x221:0x0c+0x221+1])^0x02) + data[0x0c+0x2210+1:]
 
 def _corrupt_block_hashes(data, debug=False):
src/allmydata/test/common_util.py (r07e4fe84 → rd76bea4)

@@ -6 +6 @@
 from random import randrange
 from six.moves import StringIO
+from io import (
+    TextIOWrapper,
+    BytesIO,
+)
 
 from twisted.internet import reactor, defer
@@ -36 +40 @@
     raise unittest.SkipTest("A non-ASCII argv could not be encoded on this platform.")
 
-def run_cli(verb, *args, **kwargs):
-    precondition(not [True for arg in args if not isinstance(arg, str)],
-                 "arguments to do_cli must be strs -- convert using unicode_to_argv", args=args)
-    nodeargs = kwargs.get("nodeargs", [])
+
+def _getvalue(io):
+    """
+    Read out the complete contents of a file-like object.
+    """
+    io.seek(0)
+    return io.read()
+
+
+def run_cli_bytes(verb, *args, **kwargs):
+    """
+    Run a Tahoe-LAFS CLI command specified as bytes.
+
+    Most code should prefer ``run_cli_unicode`` which deals with all the
+    necessary encoding considerations.  This helper still exists so that novel
+    misconfigurations can be explicitly tested (for example, receiving UTF-8
+    bytes when the system encoding claims to be ASCII).
+
+    :param bytes verb: The command to run.  For example, ``b"create-node"``.
+
+    :param [bytes] args: The arguments to pass to the command.  For example,
+        ``(b"--hostname=localhost",)``.
+
+    :param [bytes] nodeargs: Extra arguments to pass to the Tahoe executable
+        before ``verb``.
+
+    :param bytes stdin: Text to pass to the command via stdin.
+
+    :param NoneType|str encoding: The name of an encoding which stdout and
+        stderr will be configured to use.  ``None`` means stdout and stderr
+        will accept bytes and unicode and use the default system encoding for
+        translating between them.
+    """
+    nodeargs = kwargs.pop("nodeargs", [])
+    encoding = kwargs.pop("encoding", None)
+    precondition(
+        all(isinstance(arg, bytes) for arg in [verb] + nodeargs + list(args)),
+        "arguments to run_cli must be bytes -- convert using unicode_to_argv",
+        verb=verb,
+        args=args,
+        nodeargs=nodeargs,
+    )
     argv = nodeargs + [verb] + list(args)
     stdin = kwargs.get("stdin", "")
-    stdout = StringIO()
-    stderr = StringIO()
+    if encoding is None:
+        # The original behavior, the Python 2 behavior, is to accept either
+        # bytes or unicode and try to automatically encode or decode as
+        # necessary.  This works okay for ASCII and if LANG is set
+        # appropriately.  These aren't great constraints so we should move
+        # away from this behavior.
+        stdout = StringIO()
+        stderr = StringIO()
+    else:
+        # The new behavior, the Python 3 behavior, is to accept unicode and
+        # encode it using a specific encoding.  For older versions of Python
+        # 3, the encoding is determined from LANG (bad) but for newer Python
+        # 3, the encoding is always utf-8 (good).  Tests can pass in different
+        # encodings to exercise different behaviors.
+        stdout = TextIOWrapper(BytesIO(), encoding)
+        stderr = TextIOWrapper(BytesIO(), encoding)
     d = defer.succeed(argv)
     d.addCallback(runner.parse_or_exit_with_explanation, stdout=stdout)
@@ -50 +106 @@
               stdout=stdout, stderr=stderr)
     def _done(rc):
-        return 0, stdout.getvalue(), stderr.getvalue()
+        return 0, _getvalue(stdout), _getvalue(stderr)
     def _err(f):
         f.trap(SystemExit)
-        return f.value.code, stdout.getvalue(), stderr.getvalue()
+        return f.value.code, _getvalue(stdout), _getvalue(stderr)
     d.addCallbacks(_done, _err)
     return d
+
+
+def run_cli_unicode(verb, argv, nodeargs=None, stdin=None, encoding=None):
+    """
+    Run a Tahoe-LAFS CLI command.
+
+    :param unicode verb: The command to run.  For example, ``u"create-node"``.
+
+    :param [unicode] argv: The arguments to pass to the command.  For example,
+        ``[u"--hostname=localhost"]``.
+
+    :param [unicode] nodeargs: Extra arguments to pass to the Tahoe executable
+        before ``verb``.
+
+    :param unicode stdin: Text to pass to the command via stdin.
+
+    :param NoneType|str encoding: The name of an encoding to use for all
+        bytes/unicode conversions necessary *and* the encoding to cause stdio
+        to declare with its ``encoding`` attribute.  ``None`` means ASCII will
+        be used and no declaration will be made at all.
+    """
+    if nodeargs is None:
+        nodeargs = []
+    precondition(
+        all(isinstance(arg, unicode) for arg in [verb] + nodeargs + argv),
+        "arguments to run_cli_unicode must be unicode",
+        verb=verb,
+        nodeargs=nodeargs,
+        argv=argv,
+    )
+    codec = encoding or "ascii"
+    encode = lambda t: None if t is None else t.encode(codec)
+    d = run_cli_bytes(
+        encode(verb),
+        nodeargs=list(encode(arg) for arg in nodeargs),
+        stdin=encode(stdin),
+        encoding=encoding,
+        *list(encode(arg) for arg in argv)
+    )
+    def maybe_decode(result):
+        code, stdout, stderr = result
+        if isinstance(stdout, bytes):
+            stdout = stdout.decode(codec)
+        if isinstance(stderr, bytes):
+            stderr = stderr.decode(codec)
+        return code, stdout, stderr
+    d.addCallback(maybe_decode)
+    return d
+
+
+run_cli = run_cli_bytes
+
 
 def parse_cli(*argv):
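The `TextIOWrapper`-over-`BytesIO` trick that `run_cli_bytes` uses for its captured stdio can be seen in isolation. This sketch (standalone, not Tahoe-LAFS code) shows how the chosen codec controls what bytes the captured stream holds, and mirrors the `_getvalue` rewind-and-read helper:

```python
from io import BytesIO, TextIOWrapper

def getvalue(io):
    # Same idea as _getvalue above: rewind and read everything back.
    io.seek(0)
    return io.read()

buf = BytesIO()
stdout = TextIOWrapper(buf, encoding="latin-1")
stdout.write(u"taho\N{LATIN SMALL LETTER E WITH ACUTE}")
stdout.flush()

print(repr(buf.getvalue()))    # the raw bytes, encoded with latin-1
print(repr(getvalue(stdout)))  # the decoded text, read via the wrapper
```

Passing a different codec to `TextIOWrapper` changes the byte-level output without touching the text the caller wrote, which is exactly what lets the tests simulate different locales.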
src/allmydata/test/mutable/util.py (r07e4fe84 → rd76bea4)

@@ -240 +240 @@
         fss = FakeStorageServer(peerid, s)
         ann = {
-            "anonymous-storage-FURL": b"pb://%s@nowhere/fake" % (peerid,),
+            "anonymous-storage-FURL": "pb://%s@nowhere/fake" % (str(peerid, "utf-8"),),
            "permutation-seed-base32": peerid,
         }
src/allmydata/test/test_checker.py (r07e4fe84 → rd76bea4)

@@ -157 +157 @@
         server_id = key_s
         tubid_b32 = base32.b2a(binary_tubid)
-        furl = b"pb://%s@nowhere/fake" % tubid_b32
+        furl = "pb://%s@nowhere/fake" % str(tubid_b32, "utf-8")
         ann = { "version": 0,
                 "service-name": "storage",
src/allmydata/test/test_client.py (r07e4fe84 → rd76bea4)

@@ -89 +89 @@
 )
 
-SOME_FURL = b"pb://abcde@nowhere/fake"
+SOME_FURL = "pb://abcde@nowhere/fake"
 
 BASECONFIG = "[client]\n"
src/allmydata/test/test_introducer.py (r07e4fe84 → rd76bea4)

@@ -217 +217 @@
             announcements.append( (key_s, ann) )
         ic1.subscribe_to("storage", _received)
-        furl1 = b"pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/gydnp"
-        furl1a = b"pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:7777/gydnp"
-        furl2 = b"pb://ttwwooyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/ttwwoo"
+        furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/gydnp"
+        furl1a = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:7777/gydnp"
+        furl2 = "pb://ttwwooyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/ttwwoo"
 
         private_key, public_key = ed25519.create_signing_keypair()
@@ -243 +243 @@
             key_s,ann = announcements[0]
             self.failUnlessEqual(key_s, pubkey_s)
-            self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl1)
+            self.failUnlessEqual(ann["anonymous-storage-FURL"], furl1)
             self.failUnlessEqual(ann["my-version"], "ver23")
         d.addCallback(_then1)
@@ -277 +277 @@
             key_s,ann = announcements[-1]
             self.failUnlessEqual(key_s, pubkey_s)
-            self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl1)
+            self.failUnlessEqual(ann["anonymous-storage-FURL"], furl1)
             self.failUnlessEqual(ann["my-version"], "ver24")
         d.addCallback(_then3)
@@ -289 +289 @@
             key_s,ann = announcements[-1]
             self.failUnlessEqual(key_s, pubkey_s)
-            self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl1a)
+            self.failUnlessEqual(ann["anonymous-storage-FURL"], furl1a)
             self.failUnlessEqual(ann["my-version"], "ver23")
         d.addCallback(_then4)
@@ -305 +305 @@
             key_s,ann = announcements2[-1]
             self.failUnlessEqual(key_s, pubkey_s)
-            self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl1a)
+            self.failUnlessEqual(ann["anonymous-storage-FURL"], furl1a)
             self.failUnlessEqual(ann["my-version"], "ver23")
         d.addCallback(_then5)
@@ -317 +317 @@
                               "ver23", "oldest_version", realseq,
                               FilePath(self.mktemp()))
-        furl1 = b"pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/gydnp"
+        furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/gydnp"
 
         private_key, _ = ed25519.create_signing_keypair()
@@ -415 +415 @@
                               u"nickname", "version", "oldest", fakeseq,
                               FilePath(self.mktemp()))
-        furl1 = b"pb://onug64tu@127.0.0.1:123/short" # base32("short")
+        furl1 = "pb://onug64tu@127.0.0.1:123/short" # base32("short")
         private_key, _ = ed25519.create_signing_keypair()
@@ -437 +437 @@
             v = introducer.get_announcements()[0]
             furl = v.announcement["anonymous-storage-FURL"]
-            self.failUnlessEqual(ensure_binary(furl), furl1)
+            self.failUnlessEqual(furl, furl1)
         d.addCallback(_done)
@@ -463 +463 @@
         tub = self.central_tub
         ifurl = self.central_tub.registerReference(introducer, furlFile=iff)
-        self.introducer_furl = ifurl.encode("utf-8")
+        self.introducer_furl = ifurl
 
         # we have 5 clients who publish themselves as storage servers, and a
@@ -504 +504 @@
             expected_announcements[i] += 1 # all expect a 'storage' announcement
 
-            node_furl = tub.registerReference(Referenceable()).encode("utf-8")
+            node_furl = tub.registerReference(Referenceable())
             private_key, public_key = ed25519.create_signing_keypair()
             public_key_str = ed25519.string_from_verifying_key(public_key)
@@ -521 +521 @@
             if i == 2:
                 # also publish something that nobody cares about
-                boring_furl = tub.registerReference(Referenceable()).encode("utf-8")
+                boring_furl = tub.registerReference(Referenceable())
                 c.publish("boring", make_ann(boring_furl), private_key)
@@ -659 +659 @@
             newfurl = self.central_tub.registerReference(self.the_introducer,
                                                          furlFile=iff)
-            assert ensure_binary(newfurl) == self.introducer_furl
+            assert newfurl == self.introducer_furl
         d.addCallback(_restart_introducer_tub)
@@ -711 +711 @@
             newfurl = self.central_tub.registerReference(self.the_introducer,
                                                          furlFile=iff)
-            assert ensure_binary(newfurl) == self.introducer_furl
+            assert newfurl == self.introducer_furl
         d.addCallback(_restart_introducer)
@@ -755 +755 @@
                               "my_version", "oldest",
                               fakeseq, FilePath(self.mktemp()))
-        #furl1 = b"pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:0/swissnum"
+        #furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:0/swissnum"
         #ann_s = make_ann_t(client_v2, furl1, None, 10)
         #introducer.remote_publish_v2(ann_s, Referenceable())
@@ -776 +776 @@
                               "my_version", "oldest",
                               fakeseq, FilePath(self.mktemp()))
-        furl1 = b"pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:0/swissnum"
+        furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:0/swissnum"
 
         private_key, public_key = ed25519.create_signing_keypair()
@@ -791 +791 @@
         self.failUnlessEqual(a[0].service_name, "storage")
         self.failUnlessEqual(a[0].version, "my_version")
-        self.failUnlessEqual(ensure_binary(a[0].announcement["anonymous-storage-FURL"]), furl1)
+        self.failUnlessEqual(a[0].announcement["anonymous-storage-FURL"], furl1)
 
     def _load_cache(self, cache_filepath):
@@ -824 +824 @@
         private_key, public_key = ed25519.create_signing_keypair()
         public_key_str = remove_prefix(ed25519.string_from_verifying_key(public_key), b"pub-")
-        furl1 = b"pb://onug64tu@127.0.0.1:123/short" # base32("short")
+        furl1 = "pb://onug64tu@127.0.0.1:123/short" # base32("short")
         ann_t = make_ann_t(ic, furl1, private_key, 1)
@@ -835 +835 @@
         self.failUnlessEqual(ensure_binary(announcements[0]['key_s']), public_key_str)
         ann = announcements[0]["ann"]
-        self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl1)
+        self.failUnlessEqual(ann["anonymous-storage-FURL"], furl1)
         self.failUnlessEqual(ann["seqnum"], 1)
 
         # a new announcement that replaces the first should replace the
         # cached entry, not duplicate it
-        furl2 = furl1 + b"er"
+        furl2 = furl1 + "er"
         ann_t2 = make_ann_t(ic, furl2, private_key, 2)
         ic.got_announcements([ann_t2])
@@ -848 +848 @@
         self.failUnlessEqual(ensure_binary(announcements[0]['key_s']), public_key_str)
         ann = announcements[0]["ann"]
-        self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl2)
+        self.failUnlessEqual(ann["anonymous-storage-FURL"], furl2)
         self.failUnlessEqual(ann["seqnum"], 2)
@@ -855 +855 @@
         private_key2, public_key2 = ed25519.create_signing_keypair()
         public_key_str2 = remove_prefix(ed25519.string_from_verifying_key(public_key2), b"pub-")
-        furl3 = b"pb://onug64tu@127.0.0.1:456/short"
+        furl3 = "pb://onug64tu@127.0.0.1:456/short"
         ann_t3 = make_ann_t(ic, furl3, private_key2, 1)
         ic.got_announcements([ann_t3])
@@ -865 +865 @@
                              set([ensure_binary(a["key_s"]) for a in announcements]))
         self.failUnlessEqual(set([furl2, furl3]),
-                             set([ensure_binary(a["ann"]["anonymous-storage-FURL"])
+                             set([a["ann"]["anonymous-storage-FURL"]
                                   for a in announcements]))
@@ -881 +881 @@
 
         self.failUnless(public_key_str in announcements)
-        self.failUnlessEqual(ensure_binary(announcements[public_key_str]["anonymous-storage-FURL"]),
+        self.failUnlessEqual(announcements[public_key_str]["anonymous-storage-FURL"],
                              furl2)
-        self.failUnlessEqual(ensure_binary(announcements[public_key_str2]["anonymous-storage-FURL"]),
+        self.failUnlessEqual(announcements[public_key_str2]["anonymous-storage-FURL"],
                              furl3)
@@ -998 +998 @@
         # make sure we have a working base64.b32decode. The one in
         # python2.4.[01] was broken.
-        furl = b'pb://t5g7egomnnktbpydbuijt6zgtmw4oqi5@127.0.0.1:51857/hfzv36i'
-        m = re.match(br'pb://(\w+)@', furl)
+        furl = 'pb://t5g7egomnnktbpydbuijt6zgtmw4oqi5@127.0.0.1:51857/hfzv36i'
+        m = re.match(r'pb://(\w+)@', furl)
         assert m
-        nodeid = b32decode(m.group(1).upper())
+        nodeid = b32decode(m.group(1).upper().encode("ascii"))
         self.failUnlessEqual(nodeid, b"\x9fM\xf2\x19\xcckU0\xbf\x03\r\x10\x99\xfb&\x9b-\xc7A\x1d")
@@ -1042 +1042 @@
         ic = IntroducerClient(
             mock_tub,
-            b"pb://",
+            "pb://",
             u"fake_nick",
             "0.0.0",
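The updated b32decode check above now matches against a text furl and upper-cases and ASCII-encodes the base32 portion before decoding, since `b32decode` expects upper-case bytes input. Run standalone, with the expected node id taken straight from the test's own assertion:

```python
import re
from base64 import b32decode

furl = 'pb://t5g7egomnnktbpydbuijt6zgtmw4oqi5@127.0.0.1:51857/hfzv36i'
m = re.match(r'pb://(\w+)@', furl)
# b32decode wants upper-case input, and bytes on both Python 2 and 3.
nodeid = b32decode(m.group(1).upper().encode("ascii"))
print(nodeid == b"\x9fM\xf2\x19\xcckU0\xbf\x03\r\x10\x99\xfb&\x9b-\xc7A\x1d")
```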
src/allmydata/test/test_repairer.py (r07e4fe84 → rd76bea4)

@@ -1 +1 @@
 # -*- coding: utf-8 -*-
+"""
+Ported to Python 3.
+"""
 from __future__ import print_function
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import unicode_literals
+
+from future.utils import PY2
+if PY2:
+    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
 
 from allmydata.test import common
@@ -63 +73 @@
         c1 = self.g.clients[1]
         c0.encoding_params['max_segment_size'] = 12
-        d = c0.upload(upload.Data(common.TEST_DATA, convergence=""))
+        d = c0.upload(upload.Data(common.TEST_DATA, convergence=b""))
         def _stash_uri(ur):
             self.uri = ur.get_uri()
@@ -465 +475 @@
 
         d.addCallback(lambda ignored:
-                      self.delete_shares_numbered(self.uri, range(3, 10+1)))
+                      self.delete_shares_numbered(self.uri, list(range(3, 10+1))))
         d.addCallback(lambda ignored: download_to_data(self.c1_filenode))
         d.addCallback(lambda newdata:
@@ -477 +487 @@
         d = self.upload_and_stash()
         d.addCallback(lambda ignored:
-                      self.delete_shares_numbered(self.uri, range(7)))
+                      self.delete_shares_numbered(self.uri, list(range(7))))
         d.addCallback(lambda ignored: self._stash_counts())
         d.addCallback(lambda ignored:
@@ -510 +520 @@
 
         d.addCallback(lambda ignored:
-                      self.delete_shares_numbered(self.uri, range(3, 10+1)))
+                      self.delete_shares_numbered(self.uri, list(range(3, 10+1))))
         d.addCallback(lambda ignored: download_to_data(self.c1_filenode))
         d.addCallback(lambda newdata:
@@ -528 +538 @@
         # happiness setting.
         def _delete_some_servers(ignored):
-            for i in xrange(7):
+            for i in range(7):
                 self.g.remove_server(self.g.servers_by_number[i].my_nodeid)
@@ -641 +651 @@
         # unless it has already repaired the previously-corrupted share.
         def _then_delete_7_and_try_a_download(unused=None):
-            shnums = range(10)
+            shnums = list(range(10))
             shnums.remove(shnum)
             random.shuffle(shnums)
@@ -680 +690 @@
         self.set_up_grid()
         c0 = self.g.clients[0]
-        DATA = "a"*135
+        DATA = b"a"*135
         c0.encoding_params['k'] = 22
         c0.encoding_params['n'] = 66
-        d = c0.upload(upload.Data(DATA, convergence=""))
+        d = c0.upload(upload.Data(DATA, convergence=b""))
         def _then(ur):
             self.uri = ur.get_uri()
src/allmydata/test/test_runner.py (r07e4fe84 → rd76bea4)

@@ -143 +143 @@
 
 class CreateNode(unittest.TestCase):
-    # exercise "tahoe create-node", create-introducer,
-    # create-key-generator, and create-stats-gatherer, by calling the
-    # corresponding code as a subroutine.
+    # exercise "tahoe create-node", create-introducer, and
+    # create-key-generator by calling the corresponding code as a subroutine.
 
     def workdir(self, name):
@@ -244 +243 @@
         self.do_create("introducer", "--hostname=127.0.0.1")
 
-    def test_stats_gatherer(self):
-        self.do_create("stats-gatherer", "--hostname=127.0.0.1")
-
     def test_subcommands(self):
         # no arguments should trigger a command listing, via UsageError
         self.failUnlessRaises(usage.UsageError, parse_cli,
                               )
-
-    @inlineCallbacks
-    def test_stats_gatherer_good_args(self):
-        rc,out,err = yield run_cli("create-stats-gatherer", "--hostname=foo",
-                                   self.mktemp())
-        self.assertEqual(rc, 0)
-        rc,out,err = yield run_cli("create-stats-gatherer",
-                                   "--location=tcp:foo:1234",
-                                   "--port=tcp:1234", self.mktemp())
-        self.assertEqual(rc, 0)
-
-    def test_stats_gatherer_bad_args(self):
-        def _test(args):
-            argv = args.split()
-            self.assertRaises(usage.UsageError, parse_cli, *argv)
-
-        # missing hostname/location/port
-        _test("create-stats-gatherer D")
-
-        # missing port
-        _test("create-stats-gatherer --location=foo D")
-
-        # missing location
-        _test("create-stats-gatherer --port=foo D")
-
-        # can't provide both
-        _test("create-stats-gatherer --hostname=foo --port=foo D")
-
-        # can't provide both
-        _test("create-stats-gatherer --hostname=foo --location=foo D")
-
-        # can't provide all three
-        _test("create-stats-gatherer --hostname=foo --location=foo --port=foo D")
src/allmydata/test/test_storage_client.py (r07e4fe84 → rd76bea4)

@@ -104 +104 @@
 )
 
-SOME_FURL = b"pb://abcde@nowhere/fake"
+SOME_FURL = "pb://abcde@nowhere/fake"
 
 class NativeStorageServerWithVersion(NativeStorageServer):
@@ -311 +311 @@
             # than the one that is enabled.
             u"name": u"tahoe-lafs-dummy-v2",
-            u"storage-server-FURL": SOME_FURL.decode("ascii"),
+            u"storage-server-FURL": SOME_FURL,
         }],
     }
@@ -339 +339 @@
             # and this announcement is for a plugin with a matching name
             u"name": plugin_name,
-            u"storage-server-FURL": SOME_FURL.decode("ascii"),
+            u"storage-server-FURL": SOME_FURL,
         }],
     }
@@ -390 +390 @@
             # and this announcement is for a plugin with a matching name
             u"name": plugin_name,
-            u"storage-server-FURL": SOME_FURL.decode("ascii"),
+            u"storage-server-FURL": SOME_FURL,
         }],
     }
@@ -595 +595 @@
         anonymous-storage-FURL: {furl}
         permutation-seed-base32: aaaaaaaaaaaaaaaaaaaaaaaa
-    """.format(furl=SOME_FURL.decode("utf-8"))
+    """.format(furl=SOME_FURL)
         servers = yamlutil.safe_load(servers_yaml)
         permseed = base32.a2b(b"aaaaaaaaaaaaaaaaaaaaaaaa")
@@ -611 +611 @@
         ann2 = {
             "service-name": "storage",
-            "anonymous-storage-FURL": "pb://{}@nowhere/fake2".format(base32.b2a(b"1")),
+            "anonymous-storage-FURL": "pb://{}@nowhere/fake2".format(str(base32.b2a(b"1"), "utf-8")),
             "permutation-seed-base32": "bbbbbbbbbbbbbbbbbbbbbbbb",
         }
@@ -695 +695 @@
 
         def add_one_server(x):
-            data["anonymous-storage-FURL"] = b"pb://%s@spy:nowhere/fake" % (base32.b2a(b"%d" % x),)
+            data["anonymous-storage-FURL"] = "pb://%s@spy:nowhere/fake" % (str(base32.b2a(b"%d" % x), "ascii"),)
             tub = new_tub()
             connects = []
TabularUnified src/allmydata/test/test_system.py ¶
r07e4fe84 rd76bea4
24 24    from allmydata.util.fileutil import abspath_expanduser_unicode
25 25    from allmydata.util.consumer import MemoryConsumer, download_to_data
26       from allmydata.stats import StatsGathererService
27 26    from allmydata.interfaces import IDirectoryNode, IFileNode, \
28 27        NoSuchChildError, NoSharesError
… …
668 667          self.sparent.startService()
669 668
670              self.stats_gatherer = None
671              self.stats_gatherer_furl = None
672
673 669      def tearDown(self):
674 670          log.msg("shutting down SystemTest services")
… …
714 710
715 711      @inlineCallbacks
716          def set_up_nodes(self, NUMCLIENTS=5, use_stats_gatherer=False):
    712      def set_up_nodes(self, NUMCLIENTS=5):
717 713          """
718 714          Create an introducer and ``NUMCLIENTS`` client nodes pointed at it. All
… …
726 722
727 723          :param int NUMCLIENTS: The number of client nodes to create.
728
729              :param bool use_stats_gatherer: If ``True`` then also create a stats
730                  gatherer and configure the other nodes to use it.
731 724
732 725          :return: A ``Deferred`` that fires when the nodes have connected to
… …
738 731          self.add_service(self.introducer)
739 732          self.introweb_url = self._get_introducer_web()
740
741              if use_stats_gatherer:
742                  yield self._set_up_stats_gatherer()
743 733          yield self._set_up_client_nodes()
744              if use_stats_gatherer:
745                  yield self._grab_stats()
746
747          def _set_up_stats_gatherer(self):
748              statsdir = self.getdir("stats_gatherer")
749              fileutil.make_dirs(statsdir)
750
751              location_hint, port_endpoint = self.port_assigner.assign(reactor)
752              fileutil.write(os.path.join(statsdir, "location"), location_hint)
753              fileutil.write(os.path.join(statsdir, "port"), port_endpoint)
754              self.stats_gatherer_svc = StatsGathererService(statsdir)
755              self.stats_gatherer = self.stats_gatherer_svc.stats_gatherer
756              self.stats_gatherer_svc.setServiceParent(self.sparent)
757
758              d = fireEventually()
759              sgf = os.path.join(statsdir, 'stats_gatherer.furl')
760              def check_for_furl():
761                  return os.path.exists(sgf)
762              d.addCallback(lambda junk: self.poll(check_for_furl, timeout=30))
763              def get_furl(junk):
764                  self.stats_gatherer_furl = file(sgf, 'rb').read().strip()
765              d.addCallback(get_furl)
766              return d
767 734
768 735      @inlineCallbacks
… …
834 801              config.setdefault(section, {})[feature] = value
835 802
836              setclient = partial(setconf, config, which, "client")
837 803          setnode = partial(setconf, config, which, "node")
838 804          sethelper = partial(setconf, config, which, "helper")
839 805
840 806          setnode("nickname", u"client %d \N{BLACK SMILING FACE}" % (which,))
841
842              if self.stats_gatherer_furl:
843                  setclient("stats_gatherer.furl", self.stats_gatherer_furl)
844 807
845 808          tub_location_hint, tub_port_endpoint = self.port_assigner.assign(reactor)
… …
872 835          fileutil.write(os.path.join(basedir, 'tahoe.cfg'), config)
873 836          return basedir
874
875          def _grab_stats(self):
876              d = self.stats_gatherer.poll()
877              return d
878 837
879 838      def bounce_client(self, num):
… …
1304 1263
1305 1264          def _grab_stats(ignored):
1306                  # the StatsProvider doesn't normally publish a FURL:
1307                  # instead it passes a live reference to the StatsGatherer
1308                  # (if and when it connects). To exercise the remote stats
1309                  # interface, we manually publish client0's StatsProvider
1310                  # and use client1 to query it.
1311                  sp = self.clients[0].stats_provider
1312                  sp_furl = self.clients[0].tub.registerReference(sp)
1313                  d = self.clients[1].tub.getReference(sp_furl)
1314                  d.addCallback(lambda sp_rref: sp_rref.callRemote("get_stats"))
1315                  def _got_stats(stats):
1316                      #print("STATS")
1317                      #from pprint import pprint
1318                      #pprint(stats)
1319                      s = stats["stats"]
1320                      self.failUnlessEqual(s["storage_server.accepting_immutable_shares"], 1)
1321                      c = stats["counters"]
1322                      self.failUnless("storage_server.allocate" in c)
1323                  d.addCallback(_got_stats)
1324                  return d
     1265                stats = self.clients[0].stats_provider.get_stats()
     1266                s = stats["stats"]
     1267                self.failUnlessEqual(s["storage_server.accepting_immutable_shares"], 1)
     1268                c = stats["counters"]
     1269                self.failUnless("storage_server.allocate" in c)
1325 1270          d.addCallback(_grab_stats)
… …
1630 1575          self.basedir = "system/SystemTest/test_filesystem"
1631 1576          self.data = LARGE_DATA
1632              d = self.set_up_nodes(use_stats_gatherer=True)
     1577          d = self.set_up_nodes()
1633 1578          def _new_happy_semantics(ign):
1634 1579              for c in self.clients:
… …
2619 2564          def _run_in_subprocess(ignored, verb, *args, **kwargs):
2620 2565              stdin = kwargs.get("stdin")
     2566              # XXX https://tahoe-lafs.org/trac/tahoe-lafs/ticket/3548
2621 2567              env = kwargs.get("env", os.environ)
2622 2568              # Python warnings from the child process don't matter.
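The rewritten `_grab_stats` above drops the Foolscap round-trip (publishing the StatsProvider on one client and querying it from another) in favor of reading the provider in-process. A minimal sketch of the new check, using a hypothetical `FakeStatsProvider` stand-in with the same dict shape (`"stats"` and `"counters"` keys) that the diff's assertions rely on:

```python
# FakeStatsProvider is a hypothetical stand-in for a node's real
# stats_provider; the dict shape follows the assertions in the diff.
class FakeStatsProvider(object):
    """Minimal stand-in exposing the same get_stats() shape."""
    def get_stats(self):
        return {
            "stats": {"storage_server.accepting_immutable_shares": 1},
            "counters": {"storage_server.allocate": 2},
        }

def check_storage_stats(provider):
    # Same checks as the new _grab_stats, without a Foolscap round-trip.
    stats = provider.get_stats()
    assert stats["stats"]["storage_server.accepting_immutable_shares"] == 1
    assert "storage_server.allocate" in stats["counters"]
    return stats

stats = check_storage_stats(FakeStatsProvider())
print(sorted(stats.keys()))  # ['counters', 'stats']
```

Calling `get_stats()` directly keeps the test synchronous, which is why the Deferred plumbing (`callRemote`, `_got_stats`) could be deleted.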
TabularUnified src/allmydata/test/test_upload.py ¶
r07e4fe84 rd76bea4
240 240          )
241 241          for (serverid, rref) in servers:
242                  ann = {"anonymous-storage-FURL": b"pb://%s@nowhere/fake" % base32.b2a(serverid),
    242              ann = {"anonymous-storage-FURL": "pb://%s@nowhere/fake" % str(base32.b2a(serverid), "ascii"),
243 243                     "permutation-seed-base32": base32.b2a(serverid) }
244 244              self.storage_broker.test_add_rref(serverid, rref, ann)
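The FURL change above is a Python 3 porting fix: interpolating a bytes value into a `str` template embeds its repr (`b'...'`) instead of the characters, so the encoded serverid must be decoded first. A sketch of the failure mode, using the standard library's `base64.b32encode` as a stand-in for allmydata's `base32.b2a` (assumed here only to return bytes; the real alphabet differs):

```python
import base64

serverid = b"\x00" * 20
encoded = base64.b32encode(serverid).lower()   # bytes, not str

broken = "pb://%s@nowhere/fake" % encoded
fixed = "pb://%s@nowhere/fake" % str(encoded, "ascii")

print(broken.startswith("pb://b'"))   # True: the bytes repr leaked into the FURL
print(fixed.startswith("pb://aaaa"))  # True: clean ASCII FURL
```

On Python 2 the original `b"..." % bytes` form produced the intended string, which is why the bug only surfaced during the Python 3 port.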
TabularUnified src/allmydata/util/_python3.py ¶
r07e4fe84 rd76bea4
36 36        "allmydata.crypto.util",
37 37        "allmydata.hashtree",
   38        "allmydata.immutable.checker",
38 39        "allmydata.immutable.downloader",
39 40        "allmydata.immutable.downloader.common",
… …
50 51        "allmydata.immutable.literal",
51 52        "allmydata.immutable.offloaded",
   53        "allmydata.immutable.repairer",
52 54        "allmydata.immutable.upload",
53 55        "allmydata.interfaces",
   56        "allmydata.introducer.client",
   57        "allmydata.introducer.common",
54 58        "allmydata.introducer.interfaces",
   59        "allmydata.introducer.server",
55 60        "allmydata.monitor",
56 61        "allmydata.mutable.checker",
… …
152 157      "allmydata.test.test_pipeline",
153 158      "allmydata.test.test_python3",
    159      "allmydata.test.test_repairer",
154 160      "allmydata.test.test_spans",
155 161      "allmydata.test.test_statistics",
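The additions above extend a registry of module names that are expected to import cleanly under Python 3. One way such a registry can be exercised is to try importing every listed name and collect the failures. A hedged sketch (the names below are illustrative stand-ins, not tahoe's actual list, and the real checker in `_python3.py` may work differently):

```python
import importlib

# Illustrative stand-in list; a real registry would name project modules.
PORTED_MODULES = [
    "json",
    "hashlib",
    "definitely_not_a_real_module",  # fails on purpose, to show reporting
]

def check_ported(names):
    """Import each named module; return (name, error) pairs for failures."""
    failures = []
    for name in names:
        try:
            importlib.import_module(name)
        except ImportError as e:
            failures.append((name, str(e)))
    return failures

failures = check_ported(PORTED_MODULES)
print([name for name, _ in failures])  # ['definitely_not_a_real_module']
```

Keeping the registry as plain strings means a port-progress check stays cheap: no module is imported until the checker runs.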
TabularUnified src/allmydata/util/encodingutil.py ¶
r07e4fe84 rd76bea4
252 252
253 253    ESCAPABLE_8BIT = re.compile( br'[^ !#\x25-\x5B\x5D-\x5F\x61-\x7E]', re.DOTALL)
    254
    255    def quote_output_u(*args, **kwargs):
    256        """
    257        Like ``quote_output`` but always return ``unicode``.
    258        """
    259        result = quote_output(*args, **kwargs)
    260        if isinstance(result, unicode):
    261            return result
    262        return result.decode(kwargs.get("encoding", None) or io_encoding)
    263
254 264
255 265    def quote_output(s, quotemarks=True, quote_newlines=None, encoding=None):
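The new `quote_output_u` wraps `quote_output` so callers always receive text, decoding a bytes result with the requested encoding (falling back to `io_encoding`). The same normalize-to-text pattern in a self-contained sketch, where `quote_bytes_or_text` is a hypothetical stand-in for `encodingutil.quote_output`:

```python
def quote_bytes_or_text(s):
    # Pretend implementation: may return bytes or str depending on input,
    # mimicking a function that was only partially ported to Python 3.
    if isinstance(s, bytes):
        return b"'" + s + b"'"
    return "'" + s + "'"

def quote_always_text(s, encoding=None):
    """Like quote_bytes_or_text, but always return str."""
    result = quote_bytes_or_text(s)
    if isinstance(result, str):
        return result
    # The real quote_output_u falls back to io_encoding; "utf-8" stands in.
    return result.decode(encoding or "utf-8")

print(quote_always_text(b"abc"))  # 'abc'
print(quote_always_text("abc"))   # 'abc'
```

A thin wrapper like this lets callers that need text migrate immediately, while bytes-returning callers of the inner function keep working during the port.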
TabularUnified src/allmydata/web/statistics.xhtml ¶
r07e4fe84 rd76bea4
13 13
14 14      <ul>
15           <li>Load Average: <t:transparent t:render="load_average" /></li>
16           <li>Peak Load: <t:transparent t:render="peak_load" /></li>
17 15        <li>Files Uploaded (immutable): <t:transparent t:render="uploads" /></li>
18 16        <li>Files Downloaded (immutable): <t:transparent t:render="downloads" /></li>
TabularUnified src/allmydata/web/status.py ¶
r07e4fe84 rd76bea4
1567 1567
1568 1568      @renderer
1569               def load_average(self, req, tag):
1570                   return tag(str(self._stats["stats"].get("load_monitor.avg_load")))
1571
1572               @renderer
1573               def peak_load(self, req, tag):
1574                   return tag(str(self._stats["stats"].get("load_monitor.max_load")))
1575
1576               @renderer
1577 1569      def uploads(self, req, tag):
1578 1570          files = self._stats["counters"].get("uploader.files_uploaded", 0)