Changeset d76bea4 in trunk


Timestamp:
2020-12-11T15:32:20Z
Author:
GitHub <noreply@…>
Branches:
master
Children:
e59a922
Parents:
07e4fe84 (diff), d096cc54 (diff)
Note: this is a merge changeset; the changes displayed below correspond to the merge itself.
Use the (diff) links above to see all the changes relative to each parent.
git-author:
jehadbaeth <60194206+jehadbaeth@…> (2020-12-11 15:32:20)
git-committer:
GitHub <noreply@…> (2020-12-11 15:32:20)
Message:

Merge branch 'master' into new-readme

Files:
5 added
1 deleted
34 edited

  • docs/configuration.rst

    r07e4fe84 rd76bea4  
    7676==========
    7777
    78 A node can be a client/server, an introducer, or a statistics gatherer.
     78A node can be a client/server or an introducer.
    7979
    8080Client/server nodes provide one or more of the following services:
     
    594594    for uploads. See :doc:`helper` for details.
    595595
    596 ``stats_gatherer.furl = (FURL string, optional)``
    597 
    598     If provided, the node will connect to the given stats gatherer and
    599     provide it with operational statistics.
    600 
    601596``shares.needed = (int, optional) aka "k", default 3``
    602597
     
    911906  This file is used to construct an introducer, and is created by the
    912907  "``tahoe create-introducer``" command.
    913 
    914 ``tahoe-stats-gatherer.tac``
    915 
    916   This file is used to construct a statistics gatherer, and is created by the
    917   "``tahoe create-stats-gatherer``" command.
    918908
    919909``private/control.furl``
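
The hunks above remove the statistics-gatherer material: a node can no longer be a stats gatherer, and the [client] stats_gatherer.furl option is gone. For reference, a minimal [client] section using only options still documented in this file might look like the following sketch (shares.needed defaults to 3, aka "k"; the other values are purely illustrative):

    [client]
    # no stats_gatherer.furl line any more
    helper.furl =
    shares.needed = 3
    shares.total = 10
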
  • docs/man/man1/tahoe.1

    r07e4fe84 rd76bea4  
    4646.B \f[B]create-introducer\f[]
    4747Create an introducer node.
    48 .TP
    49 .B \f[B]create-stats-gatherer\f[]
    50 Create a stats-gatherer service.
    5148.SS OPTIONS
    5249.TP
  • docs/stats.rst

    r07e4fe84 rd76bea4  
    771. `Overview`_
    882. `Statistics Categories`_
    9 3. `Running a Tahoe Stats-Gatherer Service`_
    10 4. `Using Munin To Graph Stats Values`_
     93. `Using Munin To Graph Stats Values`_
    1110
    1211Overview
     
    244243        sometimes be negative due to wraparound of the kernel's counter.
    245244
    246 **stats.load_monitor.\***
    247 
    248     When enabled, the "load monitor" continually schedules a one-second
    249     callback, and measures how late the response is. This estimates system load
    250     (if the system is idle, the response should be on time). This is only
    251     enabled if a stats-gatherer is configured.
    252 
    253     avg_load
    254         average "load" value (seconds late) over the last minute
    255 
    256     max_load
    257         maximum "load" value over the last minute
    258 
    259 
    260 Running a Tahoe Stats-Gatherer Service
    261 ======================================
    262 
    263 The "stats-gatherer" is a simple daemon that periodically collects stats from
    264 several tahoe nodes. It could be useful, e.g., in a production environment,
    265 where you want to monitor dozens of storage servers from a central management
    266 host. It merely gathers statistics from many nodes into a single place: it
    267 does not do any actual analysis.
    268 
    269 The stats gatherer listens on a network port using the same Foolscap_
    270 connection library that Tahoe clients use to connect to storage servers.
    271 Tahoe nodes can be configured to connect to the stats gatherer and publish
    272 their stats on a periodic basis. (In fact, what happens is that nodes connect
    273 to the gatherer and offer it a second FURL which points back to the node's
    274 "stats port", which the gatherer then uses to pull stats on a periodic basis.
    275 The initial connection is flipped to allow the nodes to live behind NAT
    276 boxes, as long as the stats-gatherer has a reachable IP address.)
    277 
    278 .. _Foolscap: https://foolscap.lothar.com/trac
    279 
    280 The stats-gatherer is created in the same fashion as regular tahoe client
    281 nodes and introducer nodes. Choose a base directory for the gatherer to live
    282 in (but do not create the directory). Choose the hostname that should be
    283 advertised in the gatherer's FURL. Then run:
    284 
    285 ::
    286 
    287    tahoe create-stats-gatherer --hostname=HOSTNAME $BASEDIR
    288 
    289 and start it with "tahoe start $BASEDIR". Once running, the gatherer will
    290 write a FURL into $BASEDIR/stats_gatherer.furl .
    291 
    292 To configure a Tahoe client/server node to contact the stats gatherer, copy
    293 this FURL into the node's tahoe.cfg file, in a section named "[client]",
    294 under a key named "stats_gatherer.furl", like so:
    295 
    296 ::
    297 
    298     [client]
    299     stats_gatherer.furl = pb://qbo4ktl667zmtiuou6lwbjryli2brv6t@HOSTNAME:PORTNUM/wxycb4kaexzskubjnauxeoptympyf45y
    300 
    301 or simply copy the stats_gatherer.furl file into the node's base directory
    302 (next to the tahoe.cfg file): it will be interpreted in the same way.
    303 
    304 When the gatherer is created, it will allocate a random unused TCP port, so
    305 it should not conflict with anything else that you have running on that host
    306 at that time. To explicitly control which port it uses, run the creation
    307 command with ``--location=`` and ``--port=`` instead of ``--hostname=``. If
    308 you use a hostname of ``example.org`` and a port number of ``1234``, then
    309 run::
    310 
    311   tahoe create-stats-gatherer --location=tcp:example.org:1234 --port=tcp:1234
    312 
    313 ``--location=`` is a Foolscap FURL hints string (so it can be a
    314 comma-separated list of connection hints), and ``--port=`` is a Twisted
    315 "server endpoint specification string", as described in :doc:`configuration`.
    316 
    317 Once running, the stats gatherer will create a standard JSON file in
    318 ``$BASEDIR/stats.json``. Once a minute, the gatherer will pull stats
    319 information from every connected node and write them into the file. The file
    320 will contain a dictionary, in which node identifiers (known as "tubid"
    321 strings) are the keys, and the values are a dict with 'timestamp',
    322 'nickname', and 'stats' keys. d[tubid][stats] will contain the stats
    323 dictionary as made available at http://localhost:3456/statistics?t=json . The
    324 file will only contain the most recent update from each node.
    325 
    326 Other tools can be built to examine these stats and render them into
    327 something useful. For example, a tool could sum the
    328 "storage_server.disk_avail" values from all servers to compute a
    329 total-disk-available number for the entire grid (however, the "disk watcher"
    330 daemon, in misc/operations_helpers/spacetime/, is better suited for this
    331 specific task).
    332245
    333246Using Munin To Graph Stats Values
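
The removed text above documented the stats.json file written by the (now deleted) gatherer. As a sketch of the kind of consumer it mentions, the following sums "storage_server.disk_avail" across nodes; the exact nesting of "counters"/"stats" inside each node's entry is an assumption based on the RIStatsProvider docstring removed from interfaces.py below:

    import json

    with open("stats.json") as f:          # path as described in the removed docs
        gathered = json.load(f)            # {tubid: {"timestamp": ..., "nickname": ..., "stats": {...}}}

    total_avail = 0
    for tubid, entry in gathered.items():
        node_stats = entry.get("stats", {}).get("stats", {})
        total_avail += node_stats.get("storage_server.disk_avail", 0) or 0

    print("total disk_avail across reporting servers:", total_avail)
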
  • src/allmydata/client.py

    r07e4fe84 rd76bea4  
    44from base64 import urlsafe_b64encode
    55from functools import partial
    6 
    76# On Python 2 this will be the backported package:
    87from configparser import NoSectionError
     
    8685            "shares.needed",
    8786            "shares.total",
    88             "stats_gatherer.furl",
    8987            "storage.plugins",
    9088        ),
     
    679677
    680678    def init_stats_provider(self):
    681         gatherer_furl = self.config.get_config("client", "stats_gatherer.furl", None)
    682         if gatherer_furl:
    683             # FURLs should be bytes:
    684             gatherer_furl = gatherer_furl.encode("utf-8")
    685         self.stats_provider = StatsProvider(self, gatherer_furl)
     679        self.stats_provider = StatsProvider(self)
    686680        self.stats_provider.setServiceParent(self)
    687681        self.stats_provider.register_producer(self)
  • src/allmydata/immutable/checker.py

    r07e4fe84 rd76bea4  
     1"""
     2Ported to Python 3.
     3"""
     4from __future__ import absolute_import
     5from __future__ import division
     6from __future__ import print_function
     7from __future__ import unicode_literals
     8
     9from future.utils import PY2
     10if PY2:
     11    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
     12
    113from zope.interface import implementer
    214from twisted.internet import defer
  • src/allmydata/immutable/repairer.py

    r07e4fe84 rd76bea4  
     1"""
     2Ported to Python 3.
     3"""
     4from __future__ import absolute_import
     5from __future__ import division
     6from __future__ import print_function
     7from __future__ import unicode_literals
     8
     9from future.utils import PY2
     10if PY2:
     11    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
     12
    113from zope.interface import implementer
    214from twisted.internet import defer
  • src/allmydata/interfaces.py

    r07e4fe84 rd76bea4  
    29322932
    29332933
    2934 class RIStatsProvider(RemoteInterface):
    2935     __remote_name__ = native_str("RIStatsProvider.tahoe.allmydata.com")
    2936     """
    2937     Provides access to statistics and monitoring information.
    2938     """
    2939 
    2940     def get_stats():
    2941         """
    2942         returns a dictionary containing 'counters' and 'stats', each a
    2943         dictionary with string counter/stat name keys, and numeric or None values.
    2944         counters are monotonically increasing measures of work done, and
    2945         stats are instantaneous measures (potentially time averaged
    2946         internally)
    2947         """
    2948         return DictOf(bytes, DictOf(bytes, ChoiceOf(float, int, long, None)))
    2949 
    2950 
    2951 class RIStatsGatherer(RemoteInterface):
    2952     __remote_name__ = native_str("RIStatsGatherer.tahoe.allmydata.com")
    2953     """
    2954     Provides a monitoring service for centralised collection of stats
    2955     """
    2956 
    2957     def provide(provider=RIStatsProvider, nickname=bytes):
    2958         """
    2959         @param provider: a stats collector instance that should be polled
    2960                          periodically by the gatherer to collect stats.
    2961         @param nickname: a name useful to identify the provided client
    2962         """
    2963         return None
    2964 
    2965 
    29662934class IStatsProducer(Interface):
    29672935    def get_stats():
  • src/allmydata/introducer/client.py

    r07e4fe84 rd76bea4  
    1 from past.builtins import unicode, long
     1"""
     2Ported to Python 3.
     3"""
     4from __future__ import absolute_import
     5from __future__ import division
     6from __future__ import print_function
     7from __future__ import unicode_literals
     8
     9from future.utils import PY2
     10if PY2:
     11    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
     12from past.builtins import long
     13
    214from six import ensure_text
    315
     
    2840                 sequencer, cache_filepath):
    2941        self._tub = tub
    30         if isinstance(introducer_furl, unicode):
     42        if isinstance(introducer_furl, str):
    3143            introducer_furl = introducer_furl.encode("utf-8")
    3244        self.introducer_furl = introducer_furl
    3345
    34         assert type(nickname) is unicode
     46        assert isinstance(nickname, str)
    3547        self._nickname = nickname
    3648        self._my_version = my_version
     
    115127    def _save_announcements(self):
    116128        announcements = []
    117         for _, value in self._inbound_announcements.items():
     129        for value in self._inbound_announcements.values():
    118130            ann, key_s, time_stamp = value
    119131            # On Python 2, bytes strings are encoded into YAML Unicode strings.
     
    126138            announcements.append(server_params)
    127139        announcement_cache_yaml = yamlutil.safe_dump(announcements)
    128         if isinstance(announcement_cache_yaml, unicode):
     140        if isinstance(announcement_cache_yaml, str):
    129141            announcement_cache_yaml = announcement_cache_yaml.encode("utf-8")
    130142        self._cache_filepath.setContent(announcement_cache_yaml)
     
    171183        self._subscribed_service_names.add(service_name)
    172184        self._maybe_subscribe()
    173         for index,(ann,key_s,when) in self._inbound_announcements.items():
     185        for index,(ann,key_s,when) in list(self._inbound_announcements.items()):
    174186            precondition(isinstance(key_s, bytes), key_s)
    175187            servicename = index[0]
     
    216228
    217229        # publish all announcements with the new seqnum and nonce
    218         for service_name,ann_d in self._outbound_announcements.items():
     230        for service_name,ann_d in list(self._outbound_announcements.items()):
    219231            ann_d["seqnum"] = current_seqnum
    220232            ann_d["nonce"] = current_nonce
     
    228240            return
    229241        # this re-publishes everything. The Introducer ignores duplicates
    230         for ann_t in self._published_announcements.values():
     242        for ann_t in list(self._published_announcements.values()):
    231243            self._debug_counts["outbound_message"] += 1
    232244            self._debug_outstanding += 1
     
    268280        # for ASCII values, simplejson might give us unicode *or* bytes
    269281        if "nickname" in ann and isinstance(ann["nickname"], bytes):
    270             ann["nickname"] = unicode(ann["nickname"])
     282            ann["nickname"] = str(ann["nickname"])
    271283        nick_s = ann.get("nickname",u"").encode("utf-8")
    272284        lp2 = self.log(format="announcement for nickname '%(nick)s', service=%(svc)s: %(ann)s",
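
Several hunks above wrap dict iteration in list(...). A quick standalone illustration of the Python 3 behaviour this presumably guards against (a dict view cannot be iterated while the dict is resized):

    # Python 3: mutating a dict while iterating one of its views raises
    # "RuntimeError: dictionary changed size during iteration"
    d = {"a": 1, "b": 2}
    for k, v in list(d.items()):   # snapshot the items first, as the patched code does
        if k == "a":
            d["c"] = 3             # safe only because we iterate over the snapshot
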
  • src/allmydata/introducer/common.py

    r07e4fe84 rd76bea4  
    1 from past.builtins import unicode
     1"""
     2Ported to Python 3.
     3"""
     4from __future__ import absolute_import
     5from __future__ import division
     6from __future__ import print_function
     7from __future__ import unicode_literals
     8
     9from future.utils import PY2
     10if PY2:
     11    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
    212
    313import re
     
    919def get_tubid_string_from_ann(ann):
    1020    furl = ann.get("anonymous-storage-FURL") or ann.get("FURL")
    11     if isinstance(furl, unicode):
    12         furl = furl.encode("utf-8")
    1321    return get_tubid_string(furl)
    1422
    1523def get_tubid_string(furl):
    16     m = re.match(br'pb://(\w+)@', furl)
     24    m = re.match(r'pb://(\w+)@', furl)
    1725    assert m
    18     return m.group(1).lower()
     26    return m.group(1).lower().encode("ascii")
    1927
    2028
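
get_tubid_string now matches a text FURL and encodes the captured tub ID, so callers always get ASCII bytes back. A standalone sketch of the new behaviour, using a made-up FURL:

    import re

    furl = "pb://ab3fxyz7@tcp:example.org:1234/swissnum"   # hypothetical FURL
    m = re.match(r'pb://(\w+)@', furl)
    tubid = m.group(1).lower().encode("ascii")
    assert tubid == b"ab3fxyz7"
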
  • src/allmydata/introducer/server.py

    r07e4fe84 rd76bea4  
     1"""
     2Ported to Python 3.
     3"""
     4
     5from __future__ import absolute_import
     6from __future__ import division
     7from __future__ import print_function
     8from __future__ import unicode_literals
     9
     10
     11from future.utils import PY2
     12if PY2:
     13    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
    114from past.builtins import long
    2 from six import ensure_str, ensure_text
     15from six import ensure_text
    316
    417import time, os.path, textwrap
     
    158171        # expected keys are: version, nickname, app-versions, my-version,
    159172        # oldest-supported
    160         self._subscribers = {}
     173        self._subscribers = dictutil.UnicodeKeyDict({})
    161174
    162175        self._debug_counts = {"inbound_message": 0,
     
    182195        """Return a list of AnnouncementDescriptor for all announcements"""
    183196        announcements = []
    184         for (index, (_, canary, ann, when)) in self._announcements.items():
     197        for (index, (_, canary, ann, when)) in list(self._announcements.items()):
    185198            ad = AnnouncementDescriptor(when, index, canary, ann)
    186199            announcements.append(ad)
     
    190203        """Return a list of SubscriberDescriptor objects for all subscribers"""
    191204        s = []
    192         for service_name, subscriptions in self._subscribers.items():
    193             for rref,(subscriber_info,when) in subscriptions.items():
     205        for service_name, subscriptions in list(self._subscribers.items()):
     206            for rref,(subscriber_info,when) in list(subscriptions.items()):
    194207                # note that if the subscriber didn't do Tub.setLocation,
    195208                # tubid will be None. Also, subscribers do not tell us which
     
    282295        self.log("introducer: subscription[%s] request at %s"
    283296                 % (service_name, subscriber), umid="U3uzLg")
    284         service_name = ensure_str(service_name)
     297        service_name = ensure_text(service_name)
    285298        subscriber_info = dictutil.UnicodeKeyDict({
    286299            ensure_text(k): v for (k, v) in subscriber_info.items()
     
    308321        subscriber.notifyOnDisconnect(_remove)
    309322
     323        # Make sure types are correct:
     324        for k in self._announcements:
     325            assert isinstance(k[0], type(service_name))
     326
    310327        # now tell them about any announcements they're interested in
    311         assert {type(service_name)}.issuperset(
    312             set(type(k[0]) for k in self._announcements)), (
    313                 service_name, self._announcements.keys()
    314         )
    315328        announcements = set( [ ann_t
    316329                               for idx,(ann_t,canary,ann,when)
  • src/allmydata/scripts/create_node.py

    r07e4fe84 rd76bea4  
    319319    c.write("[client]\n")
    320320    c.write("helper.furl =\n")
    321     c.write("#stats_gatherer.furl =\n")
    322321    c.write("\n")
    323322    c.write("# Encoding parameters this client will use for newly-uploaded files\n")
  • src/allmydata/scripts/run_common.py

    r07e4fe84 rd76bea4  
    1111from allmydata.scripts.default_nodedir import _default_nodedir
    1212from allmydata.util import fileutil
    13 from allmydata.node import read_config
    1413from allmydata.util.encodingutil import listdir_unicode, quote_local_unicode_path
    1514from allmydata.util.configutil import UnknownConfigError
     
    4847def identify_node_type(basedir):
    4948    """
    50     :return unicode: None or one of: 'client', 'introducer',
    51         'key-generator' or 'stats-gatherer'
     49    :return unicode: None or one of: 'client', 'introducer', or
     50        'key-generator'
    5251    """
    5352    tac = u''
     
    6059        return None
    6160
    62     for t in (u"client", u"introducer", u"key-generator", u"stats-gatherer"):
     61    for t in (u"client", u"introducer", u"key-generator"):
    6362        if t in tac:
    6463            return t
     
    136135                u"client": lambda: maybeDeferred(namedAny("allmydata.client.create_client"), self.basedir),
    137136                u"introducer": lambda: maybeDeferred(namedAny("allmydata.introducer.server.create_introducer"), self.basedir),
    138                 u"stats-gatherer": lambda: maybeDeferred(namedAny("allmydata.stats.StatsGathererService"), read_config(self.basedir, None), self.basedir, verbose=True),
    139137                u"key-generator": key_generator_removed,
    140138            }
  • src/allmydata/scripts/runner.py

    r07e4fe84 rd76bea4  
    1010from allmydata.scripts.common import get_default_nodedir
    1111from allmydata.scripts import debug, create_node, cli, \
    12     stats_gatherer, admin, tahoe_daemonize, tahoe_start, \
     12    admin, tahoe_daemonize, tahoe_start, \
    1313    tahoe_stop, tahoe_restart, tahoe_run, tahoe_invite
    1414from allmydata.util.encodingutil import quote_output, quote_local_unicode_path, get_io_encoding
     
    6161
    6262    subCommands = (     create_node.subCommands
    63                     +   stats_gatherer.subCommands
    6463                    +   admin.subCommands
    6564                    +   process_control_commands
     
    108107
    109108create_dispatch = {}
    110 for module in (create_node, stats_gatherer):
     109for module in (create_node,):
    111110    create_dispatch.update(module.dispatch)
    112111
  • src/allmydata/scripts/tahoe_add_alias.py

    r07e4fe84 rd76bea4  
    11from __future__ import print_function
     2from __future__ import unicode_literals
    23
    34import os.path
     
    1112from allmydata.scripts.common import get_aliases
    1213from allmydata.util.fileutil import move_into_place
    13 from allmydata.util.encodingutil import unicode_to_output, quote_output
     14from allmydata.util.encodingutil import quote_output, quote_output_u
    1415
    1516
     
    4950    old_aliases = get_aliases(nodedir)
    5051    if alias in old_aliases:
    51         print("Alias %s already exists!" % quote_output(alias), file=stderr)
     52        show_output(stderr, "Alias {alias} already exists!", alias=alias)
    5253        return 1
    5354    aliasfile = os.path.join(nodedir, "private", "aliases")
     
    5556
    5657    add_line_to_aliasfile(aliasfile, alias, cap)
    57 
    58     print("Alias %s added" % quote_output(alias), file=stdout)
     58    show_output(stdout, "Alias {alias} added", alias=alias)
    5959    return 0
    6060
     
    7676    old_aliases = get_aliases(nodedir)
    7777    if alias in old_aliases:
    78         print("Alias %s already exists!" % quote_output(alias), file=stderr)
     78        show_output(stderr, "Alias {alias} already exists!", alias=alias)
    7979        return 1
    8080
     
    9494
    9595    add_line_to_aliasfile(aliasfile, alias, new_uri)
     96    show_output(stdout, "Alias {alias} created", alias=alias)
     97    return 0
    9698
    97     print("Alias %s created" % (quote_output(alias),), file=stdout)
    98     return 0
     99
     100def show_output(fp, template, **kwargs):
     101    """
     102    Print to just about anything.
     103
     104    :param fp: A file-like object to which to print.  This handles the case
     105        where ``fp`` declares a support encoding with the ``encoding``
     106        attribute (eg sys.stdout on Python 3).  It handles the case where
     107        ``fp`` declares no supported encoding via ``None`` for its
     108        ``encoding`` attribute (eg sys.stdout on Python 2 when stdout is not a
     109        tty).  It handles the case where ``fp`` declares an encoding that does
     110        not support all of the characters in the output by forcing the
     111        "namereplace" error handler.  It handles the case where there is no
     112        ``encoding`` attribute at all (eg StringIO.StringIO) by writing
     113        utf-8-encoded bytes.
     114    """
     115    assert isinstance(template, unicode)
     116
     117    # On Python 3 fp has an encoding attribute under all real usage.  On
     118    # Python 2, the encoding attribute is None if stdio is not a tty.  The
     119    # test suite often passes StringIO which has no such attribute.  Make
     120    # allowances for this until the test suite is fixed and Python 2 is no
     121    # more.
     122    try:
     123        encoding = fp.encoding or "utf-8"
     124    except AttributeError:
     125        has_encoding = False
     126        encoding = "utf-8"
     127    else:
     128        has_encoding = True
     129
     130    output = template.format(**{
     131        k: quote_output_u(v, encoding=encoding)
     132        for (k, v)
     133        in kwargs.items()
     134    })
     135    safe_output = output.encode(encoding, "namereplace")
     136    if has_encoding:
     137        safe_output = safe_output.decode(encoding)
     138    print(safe_output, file=fp)
    99139
    100140
     
    112152
    113153
     154def _escape_format(t):
     155    """
     156    _escape_format(t).format() == t
     157
     158    :param unicode t: The text to escape.
     159    """
     160    return t.replace("{", "{{").replace("}", "}}")
     161
     162
    114163def list_aliases(options):
    115     nodedir = options['node-directory']
    116     stdout = options.stdout
    117     stderr = options.stderr
    118 
    119     data = _get_alias_details(nodedir)
    120 
    121     max_width = max([len(quote_output(name)) for name in data.keys()] + [0])
    122     fmt = "%" + str(max_width) + "s: %s"
    123     rc = 0
     164    """
     165    Show aliases that exist.
     166    """
     167    data = _get_alias_details(options['node-directory'])
    124168
    125169    if options['json']:
    126         try:
    127             # XXX why are we presuming utf-8 output?
    128             print(json.dumps(data, indent=4).decode('utf-8'), file=stdout)
    129         except (UnicodeEncodeError, UnicodeDecodeError):
    130             print(json.dumps(data, indent=4), file=stderr)
    131             rc = 1
     170        output = _escape_format(json.dumps(data, indent=4).decode("ascii"))
    132171    else:
    133         for name, details in data.items():
    134             dircap = details['readonly'] if options['readonly-uri'] else details['readwrite']
    135             try:
    136                 print(fmt % (unicode_to_output(name), unicode_to_output(dircap.decode('utf-8'))), file=stdout)
    137             except (UnicodeEncodeError, UnicodeDecodeError):
    138                 print(fmt % (quote_output(name), quote_output(dircap)), file=stderr)
    139                 rc = 1
     172        def dircap(details):
     173            return (
     174                details['readonly']
     175                if options['readonly-uri']
     176                else details['readwrite']
     177            ).decode("utf-8")
    140178
    141     if rc == 1:
    142         print("\nThis listing included aliases or caps that could not be converted to the terminal" \
    143                         "\noutput encoding. These are shown using backslash escapes and in quotes.", file=stderr)
    144     return rc
     179        def format_dircap(name, details):
     180            return fmt % (name, dircap(details))
     181
     182        max_width = max([len(quote_output(name)) for name in data.keys()] + [0])
     183        fmt = "%" + str(max_width) + "s: %s"
     184        output = "\n".join(list(
     185            format_dircap(name, details)
     186            for name, details
     187            in data.items()
     188        ))
     189
     190    if output:
     191        # Show whatever we computed.  Skip this if there is no output to avoid
     192        # a spurious blank line.
     193        show_output(options.stdout, output)
     194
     195    return 0
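
A minimal sketch of calling the new show_output helper (assuming a checkout with this change is importable); sys.stdout has an encoding attribute on Python 3, so the has_encoding branch is taken and the quoted alias is printed as text:

    import sys
    from allmydata.scripts.tahoe_add_alias import show_output

    show_output(sys.stdout, "Alias {alias} added", alias=u"tahoe")
    # prints: Alias 'tahoe' added   (quote_output_u supplies the quoting)
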
  • src/allmydata/stats.py

    r07e4fe84 rd76bea4  
    11from __future__ import print_function
    22
    3 import json
    4 import os
    5 import pprint
    63import time
    7 from collections import deque
    84
    95# Python 2 compatibility
     
    128    from future.builtins import str  # noqa: F401
    139
    14 from twisted.internet import reactor
    1510from twisted.application import service
    1611from twisted.application.internet import TimerService
    1712from zope.interface import implementer
    18 from foolscap.api import eventually, DeadReferenceError, Referenceable, Tub
     13from foolscap.api import eventually
    1914
    2015from allmydata.util import log
    21 from allmydata.util.encodingutil import quote_local_unicode_path
    22 from allmydata.interfaces import RIStatsProvider, RIStatsGatherer, IStatsProducer
    23 
    24 @implementer(IStatsProducer)
    25 class LoadMonitor(service.MultiService):
    26 
    27     loop_interval = 1
    28     num_samples = 60
    29 
    30     def __init__(self, provider, warn_if_delay_exceeds=1):
    31         service.MultiService.__init__(self)
    32         self.provider = provider
    33         self.warn_if_delay_exceeds = warn_if_delay_exceeds
    34         self.started = False
    35         self.last = None
    36         self.stats = deque()
    37         self.timer = None
    38 
    39     def startService(self):
    40         if not self.started:
    41             self.started = True
    42             self.timer = reactor.callLater(self.loop_interval, self.loop)
    43         service.MultiService.startService(self)
    44 
    45     def stopService(self):
    46         self.started = False
    47         if self.timer:
    48             self.timer.cancel()
    49             self.timer = None
    50         return service.MultiService.stopService(self)
    51 
    52     def loop(self):
    53         self.timer = None
    54         if not self.started:
    55             return
    56         now = time.time()
    57         if self.last is not None:
    58             delay = now - self.last - self.loop_interval
    59             if delay > self.warn_if_delay_exceeds:
    60                 log.msg(format='excessive reactor delay (%ss)', args=(delay,),
    61                         level=log.UNUSUAL)
    62             self.stats.append(delay)
    63             while len(self.stats) > self.num_samples:
    64                 self.stats.popleft()
    65 
    66         self.last = now
    67         self.timer = reactor.callLater(self.loop_interval, self.loop)
    68 
    69     def get_stats(self):
    70         if self.stats:
    71             avg = sum(self.stats) / len(self.stats)
    72             m_x = max(self.stats)
    73         else:
    74             avg = m_x = 0
    75         return { 'load_monitor.avg_load': avg,
    76                  'load_monitor.max_load': m_x, }
     16from allmydata.interfaces import IStatsProducer
    7717
    7818@implementer(IStatsProducer)
     
    12969
    13070
    131 @implementer(RIStatsProvider)
    132 class StatsProvider(Referenceable, service.MultiService):
     71class StatsProvider(service.MultiService):
    13372
    134     def __init__(self, node, gatherer_furl):
     73    def __init__(self, node):
    13574        service.MultiService.__init__(self)
    13675        self.node = node
    137         self.gatherer_furl = gatherer_furl # might be None
    13876
    13977        self.counters = {}
    14078        self.stats_producers = []
    141 
    142         # only run the LoadMonitor (which submits a timer every second) if
    143         # there is a gatherer who is going to be paying attention. Our stats
    144         # are visible through HTTP even without a gatherer, so run the rest
    145         # of the stats (including the once-per-minute CPUUsageMonitor)
    146         if gatherer_furl:
    147             self.load_monitor = LoadMonitor(self)
    148             self.load_monitor.setServiceParent(self)
    149             self.register_producer(self.load_monitor)
    150 
    15179        self.cpu_monitor = CPUUsageMonitor()
    15280        self.cpu_monitor.setServiceParent(self)
    15381        self.register_producer(self.cpu_monitor)
    154 
    155     def startService(self):
    156         if self.node and self.gatherer_furl:
    157             nickname_utf8 = self.node.nickname.encode("utf-8")
    158             self.node.tub.connectTo(self.gatherer_furl,
    159                                     self._connected, nickname_utf8)
    160         service.MultiService.startService(self)
    16182
    16283    def count(self, name, delta=1):
     
    17697        log.msg(format='get_stats() -> %(stats)s', stats=ret, level=log.NOISY)
    17798        return ret
    178 
    179     def remote_get_stats(self):
    180         # The remote API expects keys to be bytes:
    181         def to_bytes(d):
    182             result = {}
    183             for (k, v) in d.items():
    184                 if isinstance(k, str):
    185                     k = k.encode("utf-8")
    186                 result[k] = v
    187             return result
    188 
    189         stats = self.get_stats()
    190         return {b"counters": to_bytes(stats["counters"]),
    191                 b"stats": to_bytes(stats["stats"])}
    192 
    193     def _connected(self, gatherer, nickname):
    194         gatherer.callRemoteOnly('provide', self, nickname or '')
    195 
    196 
    197 @implementer(RIStatsGatherer)
    198 class StatsGatherer(Referenceable, service.MultiService):
    199 
    200     poll_interval = 60
    201 
    202     def __init__(self, basedir):
    203         service.MultiService.__init__(self)
    204         self.basedir = basedir
    205 
    206         self.clients = {}
    207         self.nicknames = {}
    208 
    209         self.timer = TimerService(self.poll_interval, self.poll)
    210         self.timer.setServiceParent(self)
    211 
    212     def get_tubid(self, rref):
    213         return rref.getRemoteTubID()
    214 
    215     def remote_provide(self, provider, nickname):
    216         tubid = self.get_tubid(provider)
    217         if tubid == '<unauth>':
    218             print("WARNING: failed to get tubid for %s (%s)" % (provider, nickname))
    219             # don't add to clients to poll (polluting data) don't care about disconnect
    220             return
    221         self.clients[tubid] = provider
    222         self.nicknames[tubid] = nickname
    223 
    224     def poll(self):
    225         for tubid,client in self.clients.items():
    226             nickname = self.nicknames.get(tubid)
    227             d = client.callRemote('get_stats')
    228             d.addCallbacks(self.got_stats, self.lost_client,
    229                            callbackArgs=(tubid, nickname),
    230                            errbackArgs=(tubid,))
    231             d.addErrback(self.log_client_error, tubid)
    232 
    233     def lost_client(self, f, tubid):
    234         # this is called lazily, when a get_stats request fails
    235         del self.clients[tubid]
    236         del self.nicknames[tubid]
    237         f.trap(DeadReferenceError)
    238 
    239     def log_client_error(self, f, tubid):
    240         log.msg("StatsGatherer: error in get_stats(), peerid=%s" % tubid,
    241                 level=log.UNUSUAL, failure=f)
    242 
    243     def got_stats(self, stats, tubid, nickname):
    244         raise NotImplementedError()
    245 
    246 class StdOutStatsGatherer(StatsGatherer):
    247     verbose = True
    248     def remote_provide(self, provider, nickname):
    249         tubid = self.get_tubid(provider)
    250         if self.verbose:
    251             print('connect "%s" [%s]' % (nickname, tubid))
    252             provider.notifyOnDisconnect(self.announce_lost_client, tubid)
    253         StatsGatherer.remote_provide(self, provider, nickname)
    254 
    255     def announce_lost_client(self, tubid):
    256         print('disconnect "%s" [%s]' % (self.nicknames[tubid], tubid))
    257 
    258     def got_stats(self, stats, tubid, nickname):
    259         print('"%s" [%s]:' % (nickname, tubid))
    260         pprint.pprint(stats)
    261 
    262 class JSONStatsGatherer(StdOutStatsGatherer):
    263     # inherit from StdOutStatsGatherer for connect/disconnect notifications
    264 
    265     def __init__(self, basedir=u".", verbose=True):
    266         self.verbose = verbose
    267         StatsGatherer.__init__(self, basedir)
    268         self.jsonfile = os.path.join(basedir, "stats.json")
    269 
    270         if os.path.exists(self.jsonfile):
    271             try:
    272                 with open(self.jsonfile, 'rb') as f:
    273                     self.gathered_stats = json.load(f)
    274             except Exception:
    275                 print("Error while attempting to load stats file %s.\n"
    276                       "You may need to restore this file from a backup,"
    277                       " or delete it if no backup is available.\n" %
    278                       quote_local_unicode_path(self.jsonfile))
    279                 raise
    280         else:
    281             self.gathered_stats = {}
    282 
    283     def got_stats(self, stats, tubid, nickname):
    284         s = self.gathered_stats.setdefault(tubid, {})
    285         s['timestamp'] = time.time()
    286         s['nickname'] = nickname
    287         s['stats'] = stats
    288         self.dump_json()
    289 
    290     def dump_json(self):
    291         tmp = "%s.tmp" % (self.jsonfile,)
    292         with open(tmp, 'wb') as f:
    293             json.dump(self.gathered_stats, f)
    294         if os.path.exists(self.jsonfile):
    295             os.unlink(self.jsonfile)
    296         os.rename(tmp, self.jsonfile)
    297 
    298 class StatsGathererService(service.MultiService):
    299     furl_file = "stats_gatherer.furl"
    300 
    301     def __init__(self, basedir=".", verbose=False):
    302         service.MultiService.__init__(self)
    303         self.basedir = basedir
    304         self.tub = Tub(certFile=os.path.join(self.basedir,
    305                                              "stats_gatherer.pem"))
    306         self.tub.setServiceParent(self)
    307         self.tub.setOption("logLocalFailures", True)
    308         self.tub.setOption("logRemoteFailures", True)
    309         self.tub.setOption("expose-remote-exception-types", False)
    310 
    311         self.stats_gatherer = JSONStatsGatherer(self.basedir, verbose)
    312         self.stats_gatherer.setServiceParent(self)
    313 
    314         try:
    315             with open(os.path.join(self.basedir, "location")) as f:
    316                 location = f.read().strip()
    317         except EnvironmentError:
    318             raise ValueError("Unable to find 'location' in BASEDIR, please rebuild your stats-gatherer")
    319         try:
    320             with open(os.path.join(self.basedir, "port")) as f:
    321                 port = f.read().strip()
    322         except EnvironmentError:
    323             raise ValueError("Unable to find 'port' in BASEDIR, please rebuild your stats-gatherer")
    324 
    325         self.tub.listenOn(port)
    326         self.tub.setLocation(location)
    327         ff = os.path.join(self.basedir, self.furl_file)
    328         self.gatherer_furl = self.tub.registerReference(self.stats_gatherer,
    329                                                         furlFile=ff)
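
With the gatherer classes gone, StatsProvider keeps only its local role: counting events, polling registered producers, and serving the result through the node's web "statistics" page. A minimal sketch of the trimmed-down API (assumes this change is installed; passing None for the node and the counter name chosen are only for illustration):

    from allmydata.stats import StatsProvider

    provider = StatsProvider(None)                  # no gatherer FURL parameter any more
    provider.count("downloader.files_downloaded")   # delta defaults to 1
    stats = provider.get_stats()                    # {"counters": {...}, "stats": {...}}
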
  • src/allmydata/storage_client.py

    r07e4fe84 rd76bea4  
    562562
    563563        *nickname* is optional.
    564         """
     564
     565        The furl will be a Unicode string on Python 3; on Python 2 it will be
     566        either a native (bytes) string or a Unicode string.
     567        """
     568        furl = furl.encode("utf-8")
    565569        m = re.match(br'pb://(\w+)@', furl)
    566570        assert m, furl
     
    757761            return _FoolscapStorage.from_announcement(
    758762                self._server_id,
    759                 furl.encode("utf-8"),
     763                furl,
    760764                ann,
    761765                storage_server,
     
    769773            pass
    770774        else:
    771             if isinstance(furl, str):
    772                 furl = furl.encode("utf-8")
    773775            # See comment above for the _storage_from_foolscap_plugin case
    774776            # about passing in get_rref.
  • src/allmydata/test/cli/common.py

    r07e4fe84 rd76bea4  
    11from ...util.encodingutil import unicode_to_argv
    22from ...scripts import runner
    3 from ..common_util import ReallyEqualMixin, run_cli
     3from ..common_util import ReallyEqualMixin, run_cli, run_cli_unicode
    44
    55def parse_options(basedir, command, args):
     
    1111
    1212class CLITestMixin(ReallyEqualMixin):
    13     def do_cli(self, verb, *args, **kwargs):
     13    """
     14    A mixin for use with ``GridTestMixin`` to execute CLI commands against
     15    nodes created by methods of that mixin.
     16    """
     17    def do_cli_unicode(self, verb, argv, client_num=0, **kwargs):
     18        """
     19        Run a Tahoe-LAFS CLI command.
     20
     21        :param verb: See ``run_cli_unicode``.
     22
     23        :param argv: See ``run_cli_unicode``.
     24
     25        :param int client_num: The number of the ``GridTestMixin``-created
     26            node against which to execute the command.
     27
     28        :param kwargs: Additional keyword arguments to pass to
     29            ``run_cli_unicode``.
     30        """
    1431        # client_num is used to execute client CLI commands on a specific
    1532        # client.
    16         client_num = kwargs.get("client_num", 0)
     33        client_dir = self.get_clientdir(i=client_num)
     34        nodeargs = [ u"--node-directory", client_dir ]
     35        return run_cli_unicode(verb, argv, nodeargs=nodeargs, **kwargs)
     36
     37
     38    def do_cli(self, verb, *args, **kwargs):
     39        """
     40        Like ``do_cli_unicode`` but work with ``bytes`` everywhere instead of
     41        ``unicode``.
     42
     43        Where possible, prefer ``do_cli_unicode``.
     44        """
     45        # client_num is used to execute client CLI commands on a specific
     46        # client.
     47        client_num = kwargs.pop("client_num", 0)
    1748        client_dir = unicode_to_argv(self.get_clientdir(i=client_num))
    18         nodeargs = [ "--node-directory", client_dir ]
    19         return run_cli(verb, nodeargs=nodeargs, *args, **kwargs)
     49        nodeargs = [ b"--node-directory", client_dir ]
     50        return run_cli(verb, *args, nodeargs=nodeargs, **kwargs)
  • src/allmydata/test/cli/test_alias.py

    r07e4fe84 rd76bea4  
    11import json
    2 from mock import patch
    32
    43from twisted.trial import unittest
    54from twisted.internet.defer import inlineCallbacks
    65
    7 from allmydata.util.encodingutil import unicode_to_argv
    86from allmydata.scripts.common import get_aliases
    97from allmydata.test.no_network import GridTestMixin
    108from .common import CLITestMixin
    11 from ..common_util import skip_if_cannot_represent_argv
     9from allmydata.util import encodingutil
    1210
    1311# see also test_create_alias
     
    1614
    1715    @inlineCallbacks
    18     def test_list(self):
    19         self.basedir = "cli/ListAlias/test_list"
     16    def _check_create_alias(self, alias, encoding):
     17        """
     18        Verify that ``tahoe create-alias`` can be used to create an alias named
     19        ``alias`` when argv is encoded using ``encoding``.
     20
     21        :param unicode alias: The alias to try to create.
     22
     23        :param NoneType|str encoding: The name of an encoding to force the
     24            ``create-alias`` implementation to use.  This simulates the
     25            effects of setting LANG and doing other locale-foolishness without
     26            actually having to mess with this process's global locale state.
     27            If this is ``None`` then the encoding used will be ascii but the
     28            stdio objects given to the code under test will not declare any
     29            encoding (this is like Python 2 when stdio is not a tty).
     30
     31        :return Deferred: A Deferred that fires with success if the alias can
     32            be created and that creation is reported on stdout appropriately
     33            encoded or with failure if something goes wrong.
     34        """
     35        self.basedir = self.mktemp()
    2036        self.set_up_grid(oneshare=True)
    2137
    22         rc, stdout, stderr = yield self.do_cli(
    23             "create-alias",
    24             unicode_to_argv(u"tahoe"),
     38        # We can pass an encoding into the test utilities to invoke the code
     39        # under test but we can't pass such a parameter directly to the code
     40        # under test.  Instead, that code looks at io_encoding.  So,
     41        # monkey-patch that value to our desired value here.  This is the code
     42        # that most directly takes the place of messing with LANG or the
     43        # locale module.
     44        self.patch(encodingutil, "io_encoding", encoding or "ascii")
     45
     46        rc, stdout, stderr = yield self.do_cli_unicode(
     47            u"create-alias",
     48            [alias],
     49            encoding=encoding,
    2550        )
    2651
    27         self.failUnless(unicode_to_argv(u"Alias 'tahoe' created") in stdout)
    28         self.failIf(stderr)
     52        # Make sure the result of the create-alias command is as we want it to
     53        # be.
     54        self.assertEqual(u"Alias '{}' created\n".format(alias), stdout)
     55        self.assertEqual("", stderr)
     56        self.assertEqual(0, rc)
     57
     58        # Make sure it had the intended side-effect, too - an alias created in
     59        # the node filesystem state.
    2960        aliases = get_aliases(self.get_clientdir())
    30         self.failUnless(u"tahoe" in aliases)
    31         self.failUnless(aliases[u"tahoe"].startswith("URI:DIR2:"))
     61        self.assertIn(alias, aliases)
     62        self.assertTrue(aliases[alias].startswith(u"URI:DIR2:"))
    3263
    33         rc, stdout, stderr = yield self.do_cli("list-aliases", "--json")
     64        # And inspect the state via the user interface list-aliases command
     65        # too.
     66        rc, stdout, stderr = yield self.do_cli_unicode(
     67            u"list-aliases",
     68            [u"--json"],
     69            encoding=encoding,
     70        )
    3471
    3572        self.assertEqual(0, rc)
    3673        data = json.loads(stdout)
    37         self.assertIn(u"tahoe", data)
    38         data = data[u"tahoe"]
    39         self.assertIn("readwrite", data)
    40         self.assertIn("readonly", data)
     74        self.assertIn(alias, data)
     75        data = data[alias]
     76        self.assertIn(u"readwrite", data)
     77        self.assertIn(u"readonly", data)
    4178
    42     @inlineCallbacks
    43     def test_list_unicode_mismatch_json(self):
     79
     80    def test_list_none(self):
    4481        """
    45         pretty hack-y test, but we want to cover the 'except' on Unicode
    46         errors paths and I can't come up with a nicer way to trigger
    47         this
     82        An alias composed of all ASCII-encodeable code points can be created when
     83        stdio aren't clearly marked with an encoding.
    4884        """
    49         self.basedir = "cli/ListAlias/test_list_unicode_mismatch_json"
    50         skip_if_cannot_represent_argv(u"tahoe\u263A")
    51         self.set_up_grid(oneshare=True)
    52 
    53         rc, stdout, stderr = yield self.do_cli(
    54             "create-alias",
    55             unicode_to_argv(u"tahoe\u263A"),
     85        return self._check_create_alias(
     86            u"tahoe",
     87            encoding=None,
    5688        )
    5789
    58         self.failUnless(unicode_to_argv(u"Alias 'tahoe\u263A' created") in stdout)
    59         self.failIf(stderr)
    6090
    61         booms = []
    62 
    63         def boom(out, indent=4):
    64             if not len(booms):
    65                 booms.append(out)
    66                 raise UnicodeEncodeError("foo", u"foo", 3, 5, "foo")
    67             return str(out)
    68 
    69         with patch("allmydata.scripts.tahoe_add_alias.json.dumps", boom):
    70             aliases = get_aliases(self.get_clientdir())
    71             self.failUnless(u"tahoe\u263A" in aliases)
    72             self.failUnless(aliases[u"tahoe\u263A"].startswith("URI:DIR2:"))
    73 
    74             rc, stdout, stderr = yield self.do_cli("list-aliases", "--json")
    75 
    76             self.assertEqual(1, rc)
    77             self.assertIn("could not be converted", stderr)
    78 
    79     @inlineCallbacks
    80     def test_list_unicode_mismatch(self):
    81         self.basedir = "cli/ListAlias/test_list_unicode_mismatch"
    82         skip_if_cannot_represent_argv(u"tahoe\u263A")
    83         self.set_up_grid(oneshare=True)
    84 
    85         rc, stdout, stderr = yield self.do_cli(
    86             "create-alias",
    87             unicode_to_argv(u"tahoe\u263A"),
     91    def test_list_ascii(self):
     92        """
     93        An alias composed of all ASCII-encodeable code points can be created when
     94        the active encoding is ASCII.
     95        """
     96        return self._check_create_alias(
     97            u"tahoe",
     98            encoding="ascii",
    8899        )
    89100
    90         def boom(out):
    91             print("boom {}".format(out))
    92             return out
    93             raise UnicodeEncodeError("foo", u"foo", 3, 5, "foo")
    94101
    95         with patch("allmydata.scripts.tahoe_add_alias.unicode_to_output", boom):
    96             self.failUnless(unicode_to_argv(u"Alias 'tahoe\u263A' created") in stdout)
    97             self.failIf(stderr)
    98             aliases = get_aliases(self.get_clientdir())
    99             self.failUnless(u"tahoe\u263A" in aliases)
    100             self.failUnless(aliases[u"tahoe\u263A"].startswith("URI:DIR2:"))
     102    def test_list_latin_1(self):
     103        """
     104        An alias composed of all Latin-1-encodeable code points can be created
     105        when the active encoding is Latin-1.
    101106
    102             rc, stdout, stderr = yield self.do_cli("list-aliases")
     107        This is very similar to ``test_list_utf_8`` but the assumption of
     108        UTF-8 is nearly ubiquitous and explicitly exercising the codepaths
     109        with a UTF-8-incompatible encoding helps flush out unintentional UTF-8
     110        assumptions.
     111        """
     112        return self._check_create_alias(
     113            u"taho\N{LATIN SMALL LETTER E WITH ACUTE}",
     114            encoding="latin-1",
     115        )
    103116
    104             self.assertEqual(1, rc)
    105             self.assertIn("could not be converted", stderr)
     117
     118    def test_list_utf_8(self):
     119        """
     120        An alias composed of all UTF-8-encodeable code points can be created when
     121        the active encoding is UTF-8.
     122        """
     123        return self._check_create_alias(
     124            u"tahoe\N{SNOWMAN}",
     125            encoding="utf-8",
     126        )
  • src/allmydata/test/cli/test_cp.py

    r07e4fe84 rd76bea4  
    662662        # a local directory without a specified file name.
    663663        # https://tahoe-lafs.org/trac/tahoe-lafs/ticket/2027
    664         self.basedir = "cli/Cp/cp_verbose"
     664        self.basedir = "cli/Cp/ticket_2027"
    665665        self.set_up_grid(oneshare=True)
    666666
  • src/allmydata/test/common.py

    r07e4fe84 rd76bea4  
    1111    "skipIf",
    1212]
     13
     14from past.builtins import chr as byteschr
    1315
    1416import os, random, struct
     
    215217    :ivar FilePath basedir: The base directory of the node.
    216218
    217     :ivar bytes introducer_furl: The introducer furl with which to
     219    :ivar str introducer_furl: The introducer furl with which to
    218220        configure the client.
    219221
     
    226228    storage_plugin = attr.ib()
    227229    basedir = attr.ib(validator=attr.validators.instance_of(FilePath))
    228     introducer_furl = attr.ib(validator=attr.validators.instance_of(bytes))
     230    introducer_furl = attr.ib(validator=attr.validators.instance_of(str),
     231                              converter=six.ensure_str)
    229232    node_config = attr.ib(default=attr.Factory(dict))
    230233
     
    10571060        offset = 0x0c+0x44+sharedatasize-1
    10581061
    1059     newdata = data[:offset] + chr(ord(data[offset])^0xFF) + data[offset+1:]
     1062    newdata = data[:offset] + byteschr(ord(data[offset:offset+1])^0xFF) + data[offset+1:]
    10601063    if debug:
    10611064        log.msg("testing: flipping all bits of byte at offset %d: %r, newdata: %r" % (offset, data[offset], newdata[offset]))
     
    10851088    if debug:
    10861089        log.msg("original data: %r" % (data,))
    1087     return data[:0x0c+0x221] + chr(ord(data[0x0c+0x221])^0x02) + data[0x0c+0x2210+1:]
     1090    return data[:0x0c+0x221] + byteschr(ord(data[0x0c+0x221:0x0c+0x221+1])^0x02) + data[0x0c+0x2210+1:]
    10881091
    10891092def _corrupt_block_hashes(data, debug=False):
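
The byteschr changes above work around a bytes-indexing difference: indexing bytes with an integer yields an int on Python 3 but a one-byte string on Python 2, while slicing yields bytes on both. A standalone illustration of the portable bit-flip idiom used above:

    from past.builtins import chr as byteschr   # Python 2 style chr: returns a byte string

    data = b"\x00\xff\x10"
    # data[1] is 255 (an int) on Python 3 but b"\xff" on Python 2; data[1:2] is b"\xff" on both
    flipped = data[:1] + byteschr(ord(data[1:2]) ^ 0xFF) + data[2:]
    assert flipped == b"\x00\x00\x10"
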
  • src/allmydata/test/common_util.py

    r07e4fe84 rd76bea4  
    66from random import randrange
    77from six.moves import StringIO
     8from io import (
     9    TextIOWrapper,
     10    BytesIO,
     11)
    812
    913from twisted.internet import reactor, defer
     
    3640        raise unittest.SkipTest("A non-ASCII argv could not be encoded on this platform.")
    3741
    38 def run_cli(verb, *args, **kwargs):
    39     precondition(not [True for arg in args if not isinstance(arg, str)],
    40                  "arguments to do_cli must be strs -- convert using unicode_to_argv", args=args)
    41     nodeargs = kwargs.get("nodeargs", [])
     42
     43def _getvalue(io):
     44    """
     45    Read out the complete contents of a file-like object.
     46    """
     47    io.seek(0)
     48    return io.read()
     49
     50
     51def run_cli_bytes(verb, *args, **kwargs):
     52    """
     53    Run a Tahoe-LAFS CLI command specified as bytes.
     54
     55    Most code should prefer ``run_cli_unicode`` which deals with all the
     56    necessary encoding considerations.  This helper still exists so that novel
     57    misconfigurations can be explicitly tested (for example, receiving UTF-8
     58    bytes when the system encoding claims to be ASCII).
     59
     60    :param bytes verb: The command to run.  For example, ``b"create-node"``.
     61
     62    :param [bytes] args: The arguments to pass to the command.  For example,
     63        ``(b"--hostname=localhost",)``.
     64
     65    :param [bytes] nodeargs: Extra arguments to pass to the Tahoe executable
     66        before ``verb``.
     67
     68    :param bytes stdin: Text to pass to the command via stdin.
     69
     70    :param NoneType|str encoding: The name of an encoding which stdout and
     71        stderr will be configured to use.  ``None`` means stdout and stderr
     72        will accept bytes and unicode and use the default system encoding for
     73        translating between them.
     74    """
     75    nodeargs = kwargs.pop("nodeargs", [])
     76    encoding = kwargs.pop("encoding", None)
     77    precondition(
     78        all(isinstance(arg, bytes) for arg in [verb] + nodeargs + list(args)),
     79        "arguments to run_cli must be bytes -- convert using unicode_to_argv",
     80        verb=verb,
     81        args=args,
     82        nodeargs=nodeargs,
     83    )
    4284    argv = nodeargs + [verb] + list(args)
    4385    stdin = kwargs.get("stdin", "")
    44     stdout = StringIO()
    45     stderr = StringIO()
     86    if encoding is None:
     87        # The original behavior, the Python 2 behavior, is to accept either
     88        # bytes or unicode and try to automatically encode or decode as
     89        # necessary.  This works okay for ASCII and if LANG is set
     90        # appropriately.  These aren't great constraints so we should move
     91        # away from this behavior.
     92        stdout = StringIO()
     93        stderr = StringIO()
     94    else:
     95        # The new behavior, the Python 3 behavior, is to accept unicode and
     96        # encode it using a specific encoding.  For older versions of Python
     97        # 3, the encoding is determined from LANG (bad) but for newer Python
     98        # 3, the encoding is always utf-8 (good).  Tests can pass in different
     99        # encodings to exercise different behaviors.
     100        stdout = TextIOWrapper(BytesIO(), encoding)
     101        stderr = TextIOWrapper(BytesIO(), encoding)
    46102    d = defer.succeed(argv)
    47103    d.addCallback(runner.parse_or_exit_with_explanation, stdout=stdout)
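
The ``encoding is None`` branch keeps the old ``StringIO`` capture, while the new branch wraps a ``BytesIO`` in a ``TextIOWrapper`` so the code under test sees a text stream with an explicit encoding; ``_getvalue`` then rewinds the stream and reads the text back out. A small standalone sketch of that capture pattern (not changeset code)::

    from io import BytesIO, TextIOWrapper

    stdout = TextIOWrapper(BytesIO(), "utf-8")
    stdout.write(u"ol\N{LATIN SMALL LETTER A WITH ACUTE}!")   # code under test writes text
    stdout.seek(0)                                            # rewind; seek() flushes the pending write
    assert stdout.read() == u"ol\xe1!"
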
     
    50106                  stdout=stdout, stderr=stderr)
    51107    def _done(rc):
    52         return 0, stdout.getvalue(), stderr.getvalue()
     108        return 0, _getvalue(stdout), _getvalue(stderr)
    53109    def _err(f):
    54110        f.trap(SystemExit)
    55         return f.value.code, stdout.getvalue(), stderr.getvalue()
     111        return f.value.code, _getvalue(stdout), _getvalue(stderr)
    56112    d.addCallbacks(_done, _err)
    57113    return d
     114
     115
     116def run_cli_unicode(verb, argv, nodeargs=None, stdin=None, encoding=None):
     117    """
     118    Run a Tahoe-LAFS CLI command.
     119
     120    :param unicode verb: The command to run.  For example, ``u"create-node"``.
     121
     122    :param [unicode] argv: The arguments to pass to the command.  For example,
     123        ``[u"--hostname=localhost"]``.
     124
     125    :param [unicode] nodeargs: Extra arguments to pass to the Tahoe executable
     126        before ``verb``.
     127
     128    :param unicode stdin: Text to pass to the command via stdin.
     129
     130    :param NoneType|str encoding: The name of an encoding to use for all
     131        bytes/unicode conversions necessary *and* the encoding to cause stdio
     132        to declare with its ``encoding`` attribute.  ``None`` means ASCII will
     133        be used and no declaration will be made at all.
     134    """
     135    if nodeargs is None:
     136        nodeargs = []
     137    precondition(
     138        all(isinstance(arg, unicode) for arg in [verb] + nodeargs + argv),
     139        "arguments to run_cli_unicode must be unicode",
     140        verb=verb,
     141        nodeargs=nodeargs,
     142        argv=argv,
     143    )
     144    codec = encoding or "ascii"
     145    encode = lambda t: None if t is None else t.encode(codec)
     146    d = run_cli_bytes(
     147        encode(verb),
     148        nodeargs=list(encode(arg) for arg in nodeargs),
     149        stdin=encode(stdin),
     150        encoding=encoding,
     151        *list(encode(arg) for arg in argv)
     152    )
     153    def maybe_decode(result):
     154        code, stdout, stderr = result
     155        if isinstance(stdout, bytes):
     156            stdout = stdout.decode(codec)
     157        if isinstance(stderr, bytes):
     158            stderr = stderr.decode(codec)
     159        return code, stdout, stderr
     160    d.addCallback(maybe_decode)
     161    return d
     162
     163
     164run_cli = run_cli_bytes
     165
    58166
    59167def parse_cli(*argv):
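
``run_cli_unicode`` is the preferred entry point going forward: it checks that every argument is unicode, encodes with the requested codec (ASCII when none is given), delegates to ``run_cli_bytes``, and decodes the captured output again. A hedged usage sketch; the subcommand, arguments, and expected exit code below are illustrative, not taken from the changeset::

    from twisted.internet.defer import inlineCallbacks
    from twisted.trial import unittest

    from allmydata.test.common_util import run_cli_unicode

    class ExampleCLITests(unittest.TestCase):
        @inlineCallbacks
        def test_create_node(self):
            rc, out, err = yield run_cli_unicode(
                u"create-node",
                [u"--hostname=localhost", u"example-node"],   # hypothetical arguments
                encoding="utf-8",       # stdout/stderr become UTF-8 text streams
            )
            self.assertEqual(rc, 0)     # illustrative expectation
            # stdout/stderr arrive here as unicode regardless of the encoding used.
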
  • TabularUnified src/allmydata/test/mutable/util.py

    r07e4fe84 rd76bea4  
    240240    fss = FakeStorageServer(peerid, s)
    241241    ann = {
    242         "anonymous-storage-FURL": b"pb://%s@nowhere/fake" % (peerid,),
     242        "anonymous-storage-FURL": "pb://%s@nowhere/fake" % (str(peerid, "utf-8"),),
    243243        "permutation-seed-base32": peerid,
    244244    }
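
The announcement dictionaries in these fixtures now carry FURLs as native ``str``: the base32 peer ID still comes back from ``base32.b2a()`` as ``bytes`` on the ported code paths, so it is decoded before interpolation. A standalone sketch of the pattern (the peer ID value is made up)::

    from allmydata.util import base32

    peerid = b"\x00" * 20                         # hypothetical 20-byte tub ID
    tubid_b32 = base32.b2a(peerid)                # bytes
    furl = "pb://%s@nowhere/fake" % str(tubid_b32, "utf-8")

    assert isinstance(furl, str)                  # announcements now hold text FURLs
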
  • TabularUnified src/allmydata/test/test_checker.py

    r07e4fe84 rd76bea4  
    157157            server_id = key_s
    158158            tubid_b32 = base32.b2a(binary_tubid)
    159             furl = b"pb://%s@nowhere/fake" % tubid_b32
     159            furl = "pb://%s@nowhere/fake" % str(tubid_b32, "utf-8")
    160160            ann = { "version": 0,
    161161                    "service-name": "storage",
  • TabularUnified src/allmydata/test/test_client.py

    r07e4fe84 rd76bea4  
    8989)
    9090
    91 SOME_FURL = b"pb://abcde@nowhere/fake"
     91SOME_FURL = "pb://abcde@nowhere/fake"
    9292
    9393BASECONFIG = "[client]\n"
  • TabularUnified src/allmydata/test/test_introducer.py

    r07e4fe84 rd76bea4  
    217217            announcements.append( (key_s, ann) )
    218218        ic1.subscribe_to("storage", _received)
    219         furl1 = b"pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/gydnp"
    220         furl1a = b"pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:7777/gydnp"
    221         furl2 = b"pb://ttwwooyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/ttwwoo"
     219        furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/gydnp"
     220        furl1a = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:7777/gydnp"
     221        furl2 = "pb://ttwwooyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/ttwwoo"
    222222
    223223        private_key, public_key = ed25519.create_signing_keypair()
     
    243243            key_s,ann = announcements[0]
    244244            self.failUnlessEqual(key_s, pubkey_s)
    245             self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl1)
     245            self.failUnlessEqual(ann["anonymous-storage-FURL"], furl1)
    246246            self.failUnlessEqual(ann["my-version"], "ver23")
    247247        d.addCallback(_then1)
     
    277277            key_s,ann = announcements[-1]
    278278            self.failUnlessEqual(key_s, pubkey_s)
    279             self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl1)
     279            self.failUnlessEqual(ann["anonymous-storage-FURL"], furl1)
    280280            self.failUnlessEqual(ann["my-version"], "ver24")
    281281        d.addCallback(_then3)
     
    289289            key_s,ann = announcements[-1]
    290290            self.failUnlessEqual(key_s, pubkey_s)
    291             self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl1a)
     291            self.failUnlessEqual(ann["anonymous-storage-FURL"], furl1a)
    292292            self.failUnlessEqual(ann["my-version"], "ver23")
    293293        d.addCallback(_then4)
     
    305305            key_s,ann = announcements2[-1]
    306306            self.failUnlessEqual(key_s, pubkey_s)
    307             self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl1a)
     307            self.failUnlessEqual(ann["anonymous-storage-FURL"], furl1a)
    308308            self.failUnlessEqual(ann["my-version"], "ver23")
    309309        d.addCallback(_then5)
     
    317317                               "ver23", "oldest_version", realseq,
    318318                               FilePath(self.mktemp()))
    319         furl1 = b"pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/gydnp"
     319        furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/gydnp"
    320320
    321321        private_key, _ = ed25519.create_signing_keypair()
     
    415415                             u"nickname", "version", "oldest", fakeseq,
    416416                             FilePath(self.mktemp()))
    417         furl1 = b"pb://onug64tu@127.0.0.1:123/short" # base32("short")
     417        furl1 = "pb://onug64tu@127.0.0.1:123/short" # base32("short")
    418418        private_key, _ = ed25519.create_signing_keypair()
    419419
     
    437437            v = introducer.get_announcements()[0]
    438438            furl = v.announcement["anonymous-storage-FURL"]
    439             self.failUnlessEqual(ensure_binary(furl), furl1)
     439            self.failUnlessEqual(furl, furl1)
    440440        d.addCallback(_done)
    441441
     
    463463        tub = self.central_tub
    464464        ifurl = self.central_tub.registerReference(introducer, furlFile=iff)
    465         self.introducer_furl = ifurl.encode("utf-8")
     465        self.introducer_furl = ifurl
    466466
    467467        # we have 5 clients who publish themselves as storage servers, and a
     
    504504            expected_announcements[i] += 1 # all expect a 'storage' announcement
    505505
    506             node_furl = tub.registerReference(Referenceable()).encode("utf-8")
     506            node_furl = tub.registerReference(Referenceable())
    507507            private_key, public_key = ed25519.create_signing_keypair()
    508508            public_key_str = ed25519.string_from_verifying_key(public_key)
     
    521521            if i == 2:
    522522                # also publish something that nobody cares about
    523                 boring_furl = tub.registerReference(Referenceable()).encode("utf-8")
     523                boring_furl = tub.registerReference(Referenceable())
    524524                c.publish("boring", make_ann(boring_furl), private_key)
    525525
     
    659659            newfurl = self.central_tub.registerReference(self.the_introducer,
    660660                                                         furlFile=iff)
    661             assert ensure_binary(newfurl) == self.introducer_furl
     661            assert newfurl == self.introducer_furl
    662662        d.addCallback(_restart_introducer_tub)
    663663
     
    711711            newfurl = self.central_tub.registerReference(self.the_introducer,
    712712                                                         furlFile=iff)
    713             assert ensure_binary(newfurl) == self.introducer_furl
     713            assert newfurl == self.introducer_furl
    714714        d.addCallback(_restart_introducer)
    715715
     
    755755                                     "my_version", "oldest",
    756756                                     fakeseq, FilePath(self.mktemp()))
    757         #furl1 = b"pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:0/swissnum"
     757        #furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:0/swissnum"
    758758        #ann_s = make_ann_t(client_v2, furl1, None, 10)
    759759        #introducer.remote_publish_v2(ann_s, Referenceable())
     
    776776                                     "my_version", "oldest",
    777777                                     fakeseq, FilePath(self.mktemp()))
    778         furl1 = b"pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:0/swissnum"
     778        furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:0/swissnum"
    779779
    780780        private_key, public_key = ed25519.create_signing_keypair()
     
    791791        self.failUnlessEqual(a[0].service_name, "storage")
    792792        self.failUnlessEqual(a[0].version, "my_version")
    793         self.failUnlessEqual(ensure_binary(a[0].announcement["anonymous-storage-FURL"]), furl1)
     793        self.failUnlessEqual(a[0].announcement["anonymous-storage-FURL"], furl1)
    794794
    795795    def _load_cache(self, cache_filepath):
     
    824824        private_key, public_key = ed25519.create_signing_keypair()
    825825        public_key_str = remove_prefix(ed25519.string_from_verifying_key(public_key), b"pub-")
    826         furl1 = b"pb://onug64tu@127.0.0.1:123/short" # base32("short")
     826        furl1 = "pb://onug64tu@127.0.0.1:123/short" # base32("short")
    827827        ann_t = make_ann_t(ic, furl1, private_key, 1)
    828828
     
    835835        self.failUnlessEqual(ensure_binary(announcements[0]['key_s']), public_key_str)
    836836        ann = announcements[0]["ann"]
    837         self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl1)
     837        self.failUnlessEqual(ann["anonymous-storage-FURL"], furl1)
    838838        self.failUnlessEqual(ann["seqnum"], 1)
    839839
    840840        # a new announcement that replaces the first should replace the
    841841        # cached entry, not duplicate it
    842         furl2 = furl1 + b"er"
     842        furl2 = furl1 + "er"
    843843        ann_t2 = make_ann_t(ic, furl2, private_key, 2)
    844844        ic.got_announcements([ann_t2])
     
    848848        self.failUnlessEqual(ensure_binary(announcements[0]['key_s']), public_key_str)
    849849        ann = announcements[0]["ann"]
    850         self.failUnlessEqual(ensure_binary(ann["anonymous-storage-FURL"]), furl2)
     850        self.failUnlessEqual(ann["anonymous-storage-FURL"], furl2)
    851851        self.failUnlessEqual(ann["seqnum"], 2)
    852852
     
    855855        private_key2, public_key2 = ed25519.create_signing_keypair()
    856856        public_key_str2 = remove_prefix(ed25519.string_from_verifying_key(public_key2), b"pub-")
    857         furl3 = b"pb://onug64tu@127.0.0.1:456/short"
     857        furl3 = "pb://onug64tu@127.0.0.1:456/short"
    858858        ann_t3 = make_ann_t(ic, furl3, private_key2, 1)
    859859        ic.got_announcements([ann_t3])
     
    865865                             set([ensure_binary(a["key_s"]) for a in announcements]))
    866866        self.failUnlessEqual(set([furl2, furl3]),
    867                              set([ensure_binary(a["ann"]["anonymous-storage-FURL"])
     867                             set([a["ann"]["anonymous-storage-FURL"]
    868868                                  for a in announcements]))
    869869
     
    881881
    882882        self.failUnless(public_key_str in announcements)
    883         self.failUnlessEqual(ensure_binary(announcements[public_key_str]["anonymous-storage-FURL"]),
     883        self.failUnlessEqual(announcements[public_key_str]["anonymous-storage-FURL"],
    884884                             furl2)
    885         self.failUnlessEqual(ensure_binary(announcements[public_key_str2]["anonymous-storage-FURL"]),
     885        self.failUnlessEqual(announcements[public_key_str2]["anonymous-storage-FURL"],
    886886                             furl3)
    887887
     
    998998        # make sure we have a working base64.b32decode. The one in
    999999        # python2.4.[01] was broken.
    1000         furl = b'pb://t5g7egomnnktbpydbuijt6zgtmw4oqi5@127.0.0.1:51857/hfzv36i'
    1001         m = re.match(br'pb://(\w+)@', furl)
     1000        furl = 'pb://t5g7egomnnktbpydbuijt6zgtmw4oqi5@127.0.0.1:51857/hfzv36i'
     1001        m = re.match(r'pb://(\w+)@', furl)
    10021002        assert m
    1003         nodeid = b32decode(m.group(1).upper())
     1003        nodeid = b32decode(m.group(1).upper().encode("ascii"))
    10041004        self.failUnlessEqual(nodeid, b"\x9fM\xf2\x19\xcckU0\xbf\x03\r\x10\x99\xfb&\x9b-\xc7A\x1d")
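
The FURL is now a ``str``, so the regex match yields text; the captured base32 group is upper-cased and encoded to ASCII bytes before being handed to the stdlib decoder. The same steps as a standalone check::

    import re
    from base64 import b32decode

    furl = 'pb://t5g7egomnnktbpydbuijt6zgtmw4oqi5@127.0.0.1:51857/hfzv36i'
    m = re.match(r'pb://(\w+)@', furl)
    nodeid = b32decode(m.group(1).upper().encode("ascii"))

    assert len(nodeid) == 20      # 32 base32 characters decode to a 20-byte tub ID
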
    10051005
     
    10421042        ic = IntroducerClient(
    10431043            mock_tub,
    1044             b"pb://",
     1044            "pb://",
    10451045            u"fake_nick",
    10461046            "0.0.0",
  • TabularUnified src/allmydata/test/test_repairer.py

    r07e4fe84 rd76bea4  
    11# -*- coding: utf-8 -*-
     2"""
     3Ported to Python 3.
     4"""
    25from __future__ import print_function
     6from __future__ import absolute_import
     7from __future__ import division
     8from __future__ import unicode_literals
     9
     10from future.utils import PY2
     11if PY2:
     12    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
    313
    414from allmydata.test import common
     
    6373        c1 = self.g.clients[1]
    6474        c0.encoding_params['max_segment_size'] = 12
    65         d = c0.upload(upload.Data(common.TEST_DATA, convergence=""))
     75        d = c0.upload(upload.Data(common.TEST_DATA, convergence=b""))
    6676        def _stash_uri(ur):
    6777            self.uri = ur.get_uri()
     
    465475
    466476        d.addCallback(lambda ignored:
    467                       self.delete_shares_numbered(self.uri, range(3, 10+1)))
     477                      self.delete_shares_numbered(self.uri, list(range(3, 10+1))))
    468478        d.addCallback(lambda ignored: download_to_data(self.c1_filenode))
    469479        d.addCallback(lambda newdata:
     
    477487        d = self.upload_and_stash()
    478488        d.addCallback(lambda ignored:
    479                       self.delete_shares_numbered(self.uri, range(7)))
     489                      self.delete_shares_numbered(self.uri, list(range(7))))
    480490        d.addCallback(lambda ignored: self._stash_counts())
    481491        d.addCallback(lambda ignored:
     
    510520
    511521        d.addCallback(lambda ignored:
    512                       self.delete_shares_numbered(self.uri, range(3, 10+1)))
     522                      self.delete_shares_numbered(self.uri, list(range(3, 10+1))))
    513523        d.addCallback(lambda ignored: download_to_data(self.c1_filenode))
    514524        d.addCallback(lambda newdata:
     
    528538        # happiness setting.
    529539        def _delete_some_servers(ignored):
    530             for i in xrange(7):
     540            for i in range(7):
    531541                self.g.remove_server(self.g.servers_by_number[i].my_nodeid)
    532542
     
    641651                # unless it has already repaired the previously-corrupted share.
    642652                def _then_delete_7_and_try_a_download(unused=None):
    643                     shnums = range(10)
     653                    shnums = list(range(10))
    644654                    shnums.remove(shnum)
    645655                    random.shuffle(shnums)
     
    680690        self.set_up_grid()
    681691        c0 = self.g.clients[0]
    682         DATA = "a"*135
     692        DATA = b"a"*135
    683693        c0.encoding_params['k'] = 22
    684694        c0.encoding_params['n'] = 66
    685         d = c0.upload(upload.Data(DATA, convergence=""))
     695        d = c0.upload(upload.Data(DATA, convergence=b""))
    686696        def _then(ur):
    687697            self.uri = ur.get_uri()
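
Several hunks above wrap ``range()`` in ``list()`` because on Python 3 ``range`` is a lazy, immutable sequence: the share-number collections in these tests need a real list so elements can be removed and the order shuffled. A quick illustration::

    import random

    shnums = list(range(10))
    shnums.remove(3)                  # a bare range object has no remove()
    random.shuffle(shnums)            # random.shuffle() needs a mutable sequence
    assert 3 not in shnums and len(shnums) == 9
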
  • TabularUnified src/allmydata/test/test_runner.py

    r07e4fe84 rd76bea4  
    143143
    144144class CreateNode(unittest.TestCase):
    145     # exercise "tahoe create-node", create-introducer,
    146     # create-key-generator, and create-stats-gatherer, by calling the
    147     # corresponding code as a subroutine.
     145    # exercise "tahoe create-node", create-introducer, and
     146    # create-key-generator by calling the corresponding code as a subroutine.
    148147
    149148    def workdir(self, name):
     
    244243        self.do_create("introducer", "--hostname=127.0.0.1")
    245244
    246     def test_stats_gatherer(self):
    247         self.do_create("stats-gatherer", "--hostname=127.0.0.1")
    248 
    249245    def test_subcommands(self):
    250246        # no arguments should trigger a command listing, via UsageError
    251247        self.failUnlessRaises(usage.UsageError, parse_cli,
    252248                              )
    253 
    254     @inlineCallbacks
    255     def test_stats_gatherer_good_args(self):
    256         rc,out,err = yield run_cli("create-stats-gatherer", "--hostname=foo",
    257                                    self.mktemp())
    258         self.assertEqual(rc, 0)
    259         rc,out,err = yield run_cli("create-stats-gatherer",
    260                                    "--location=tcp:foo:1234",
    261                                    "--port=tcp:1234", self.mktemp())
    262         self.assertEqual(rc, 0)
    263 
    264 
    265     def test_stats_gatherer_bad_args(self):
    266         def _test(args):
    267             argv = args.split()
    268             self.assertRaises(usage.UsageError, parse_cli, *argv)
    269 
    270         # missing hostname/location/port
    271         _test("create-stats-gatherer D")
    272 
    273         # missing port
    274         _test("create-stats-gatherer --location=foo D")
    275 
    276         # missing location
    277         _test("create-stats-gatherer --port=foo D")
    278 
    279         # can't provide both
    280         _test("create-stats-gatherer --hostname=foo --port=foo D")
    281 
    282         # can't provide both
    283         _test("create-stats-gatherer --hostname=foo --location=foo D")
    284 
    285         # can't provide all three
    286         _test("create-stats-gatherer --hostname=foo --location=foo --port=foo D")
    287249
    288250
  • TabularUnified src/allmydata/test/test_storage_client.py

    r07e4fe84 rd76bea4  
    104104)
    105105
    106 SOME_FURL = b"pb://abcde@nowhere/fake"
     106SOME_FURL = "pb://abcde@nowhere/fake"
    107107
    108108class NativeStorageServerWithVersion(NativeStorageServer):
     
    311311                # than the one that is enabled.
    312312                u"name": u"tahoe-lafs-dummy-v2",
    313                 u"storage-server-FURL": SOME_FURL.decode("ascii"),
     313                u"storage-server-FURL": SOME_FURL,
    314314            }],
    315315        }
     
    339339                # and this announcement is for a plugin with a matching name
    340340                u"name": plugin_name,
    341                 u"storage-server-FURL": SOME_FURL.decode("ascii"),
     341                u"storage-server-FURL": SOME_FURL,
    342342            }],
    343343        }
     
    390390                # and this announcement is for a plugin with a matching name
    391391                u"name": plugin_name,
    392                 u"storage-server-FURL": SOME_FURL.decode("ascii"),
     392                u"storage-server-FURL": SOME_FURL,
    393393            }],
    394394        }
     
    595595      anonymous-storage-FURL: {furl}
    596596      permutation-seed-base32: aaaaaaaaaaaaaaaaaaaaaaaa
    597 """.format(furl=SOME_FURL.decode("utf-8"))
     597""".format(furl=SOME_FURL)
    598598        servers = yamlutil.safe_load(servers_yaml)
    599599        permseed = base32.a2b(b"aaaaaaaaaaaaaaaaaaaaaaaa")
     
    611611        ann2 = {
    612612            "service-name": "storage",
    613             "anonymous-storage-FURL": "pb://{}@nowhere/fake2".format(base32.b2a(b"1")),
     613            "anonymous-storage-FURL": "pb://{}@nowhere/fake2".format(str(base32.b2a(b"1"), "utf-8")),
    614614            "permutation-seed-base32": "bbbbbbbbbbbbbbbbbbbbbbbb",
    615615        }
     
    695695
    696696        def add_one_server(x):
    697             data["anonymous-storage-FURL"] = b"pb://%s@spy:nowhere/fake" % (base32.b2a(b"%d" % x),)
     697            data["anonymous-storage-FURL"] = "pb://%s@spy:nowhere/fake" % (str(base32.b2a(b"%d" % x), "ascii"),)
    698698            tub = new_tub()
    699699            connects = []
  • TabularUnified src/allmydata/test/test_system.py

    r07e4fe84 rd76bea4  
    2424from allmydata.util.fileutil import abspath_expanduser_unicode
    2525from allmydata.util.consumer import MemoryConsumer, download_to_data
    26 from allmydata.stats import StatsGathererService
    2726from allmydata.interfaces import IDirectoryNode, IFileNode, \
    2827     NoSuchChildError, NoSharesError
     
    668667        self.sparent.startService()
    669668
    670         self.stats_gatherer = None
    671         self.stats_gatherer_furl = None
    672 
    673669    def tearDown(self):
    674670        log.msg("shutting down SystemTest services")
     
    714710
    715711    @inlineCallbacks
    716     def set_up_nodes(self, NUMCLIENTS=5, use_stats_gatherer=False):
     712    def set_up_nodes(self, NUMCLIENTS=5):
    717713        """
    718714        Create an introducer and ``NUMCLIENTS`` client nodes pointed at it.  All
     
    726722
    727723        :param int NUMCLIENTS: The number of client nodes to create.
    728 
    729         :param bool use_stats_gatherer: If ``True`` then also create a stats
    730             gatherer and configure the other nodes to use it.
    731724
    732725        :return: A ``Deferred`` that fires when the nodes have connected to
     
    738731        self.add_service(self.introducer)
    739732        self.introweb_url = self._get_introducer_web()
    740 
    741         if use_stats_gatherer:
    742             yield self._set_up_stats_gatherer()
    743733        yield self._set_up_client_nodes()
    744         if use_stats_gatherer:
    745             yield self._grab_stats()
    746 
    747     def _set_up_stats_gatherer(self):
    748         statsdir = self.getdir("stats_gatherer")
    749         fileutil.make_dirs(statsdir)
    750 
    751         location_hint, port_endpoint = self.port_assigner.assign(reactor)
    752         fileutil.write(os.path.join(statsdir, "location"), location_hint)
    753         fileutil.write(os.path.join(statsdir, "port"), port_endpoint)
    754         self.stats_gatherer_svc = StatsGathererService(statsdir)
    755         self.stats_gatherer = self.stats_gatherer_svc.stats_gatherer
    756         self.stats_gatherer_svc.setServiceParent(self.sparent)
    757 
    758         d = fireEventually()
    759         sgf = os.path.join(statsdir, 'stats_gatherer.furl')
    760         def check_for_furl():
    761             return os.path.exists(sgf)
    762         d.addCallback(lambda junk: self.poll(check_for_furl, timeout=30))
    763         def get_furl(junk):
    764             self.stats_gatherer_furl = file(sgf, 'rb').read().strip()
    765         d.addCallback(get_furl)
    766         return d
    767734
    768735    @inlineCallbacks
     
    834801                config.setdefault(section, {})[feature] = value
    835802
    836         setclient = partial(setconf, config, which, "client")
    837803        setnode = partial(setconf, config, which, "node")
    838804        sethelper = partial(setconf, config, which, "helper")
    839805
    840806        setnode("nickname", u"client %d \N{BLACK SMILING FACE}" % (which,))
    841 
    842         if self.stats_gatherer_furl:
    843             setclient("stats_gatherer.furl", self.stats_gatherer_furl)
    844807
    845808        tub_location_hint, tub_port_endpoint = self.port_assigner.assign(reactor)
     
    872835        fileutil.write(os.path.join(basedir, 'tahoe.cfg'), config)
    873836        return basedir
    874 
    875     def _grab_stats(self):
    876         d = self.stats_gatherer.poll()
    877         return d
    878837
    879838    def bounce_client(self, num):
     
    13041263
    13051264        def _grab_stats(ignored):
    1306             # the StatsProvider doesn't normally publish a FURL:
    1307             # instead it passes a live reference to the StatsGatherer
    1308             # (if and when it connects). To exercise the remote stats
    1309             # interface, we manually publish client0's StatsProvider
    1310             # and use client1 to query it.
    1311             sp = self.clients[0].stats_provider
    1312             sp_furl = self.clients[0].tub.registerReference(sp)
    1313             d = self.clients[1].tub.getReference(sp_furl)
    1314             d.addCallback(lambda sp_rref: sp_rref.callRemote("get_stats"))
    1315             def _got_stats(stats):
    1316                 #print("STATS")
    1317                 #from pprint import pprint
    1318                 #pprint(stats)
    1319                 s = stats["stats"]
    1320                 self.failUnlessEqual(s["storage_server.accepting_immutable_shares"], 1)
    1321                 c = stats["counters"]
    1322                 self.failUnless("storage_server.allocate" in c)
    1323             d.addCallback(_got_stats)
    1324             return d
     1265            stats = self.clients[0].stats_provider.get_stats()
     1266            s = stats["stats"]
     1267            self.failUnlessEqual(s["storage_server.accepting_immutable_shares"], 1)
     1268            c = stats["counters"]
     1269            self.failUnless("storage_server.allocate" in c)
    13251270        d.addCallback(_grab_stats)
    13261271
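
With the gatherer gone, the test reads statistics straight off the local client's ``StatsProvider``. Per the hunk above, ``get_stats()`` returns plain dicts keyed by ``"counters"`` and ``"stats"``, so no Foolscap round-trip is needed. A hedged sketch of that access pattern (``client`` is assumed to be a started client node from the test grid)::

    def grab_local_stats(client):
        """Sketch: read stats directly from a running client's StatsProvider."""
        stats = client.stats_provider.get_stats()
        accepting = stats["stats"]["storage_server.accepting_immutable_shares"]
        allocations = stats["counters"].get("storage_server.allocate", 0)
        return accepting, allocations
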
     
    16301575        self.basedir = "system/SystemTest/test_filesystem"
    16311576        self.data = LARGE_DATA
    1632         d = self.set_up_nodes(use_stats_gatherer=True)
     1577        d = self.set_up_nodes()
    16331578        def _new_happy_semantics(ign):
    16341579            for c in self.clients:
     
    26192564        def _run_in_subprocess(ignored, verb, *args, **kwargs):
    26202565            stdin = kwargs.get("stdin")
     2566            # XXX https://tahoe-lafs.org/trac/tahoe-lafs/ticket/3548
    26212567            env = kwargs.get("env", os.environ)
    26222568            # Python warnings from the child process don't matter.
  • TabularUnified src/allmydata/test/test_upload.py

    r07e4fe84 rd76bea4  
    240240        )
    241241        for (serverid, rref) in servers:
    242             ann = {"anonymous-storage-FURL": b"pb://%s@nowhere/fake" % base32.b2a(serverid),
     242            ann = {"anonymous-storage-FURL": "pb://%s@nowhere/fake" % str(base32.b2a(serverid), "ascii"),
    243243                   "permutation-seed-base32": base32.b2a(serverid) }
    244244            self.storage_broker.test_add_rref(serverid, rref, ann)
  • TabularUnified src/allmydata/util/_python3.py

    r07e4fe84 rd76bea4  
    3636    "allmydata.crypto.util",
    3737    "allmydata.hashtree",
     38    "allmydata.immutable.checker",
    3839    "allmydata.immutable.downloader",
    3940    "allmydata.immutable.downloader.common",
     
    5051    "allmydata.immutable.literal",
    5152    "allmydata.immutable.offloaded",
     53    "allmydata.immutable.repairer",
    5254    "allmydata.immutable.upload",
    5355    "allmydata.interfaces",
     56    "allmydata.introducer.client",
     57    "allmydata.introducer.common",
    5458    "allmydata.introducer.interfaces",
     59    "allmydata.introducer.server",
    5560    "allmydata.monitor",
    5661    "allmydata.mutable.checker",
     
    152157    "allmydata.test.test_pipeline",
    153158    "allmydata.test.test_python3",
     159    "allmydata.test.test_repairer",
    154160    "allmydata.test.test_spans",
    155161    "allmydata.test.test_statistics",
  • TabularUnified src/allmydata/util/encodingutil.py

    r07e4fe84 rd76bea4  
    252252
    253253ESCAPABLE_8BIT    = re.compile( br'[^ !#\x25-\x5B\x5D-\x5F\x61-\x7E]', re.DOTALL)
     254
     255def quote_output_u(*args, **kwargs):
     256    """
     257    Like ``quote_output`` but always return ``unicode``.
     258    """
     259    result = quote_output(*args, **kwargs)
     260    if isinstance(result, unicode):
     261        return result
     262    return result.decode(kwargs.get("encoding", None) or io_encoding)
     263
    254264
    255265def quote_output(s, quotemarks=True, quote_newlines=None, encoding=None):
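
``quote_output`` can hand back either ``bytes`` or ``unicode`` depending on the encoding in play; the new ``quote_output_u`` wrapper decodes any ``bytes`` result (using the passed encoding, falling back to the io encoding) so callers that build text, such as the web pages, always receive ``unicode``. A hedged sketch with a made-up input value::

    from allmydata.util.encodingutil import quote_output_u

    quoted = quote_output_u(b"weird\xffname", encoding="utf-8")
    assert isinstance(quoted, type(u""))      # always text, never bytes
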
  • TabularUnified src/allmydata/web/statistics.xhtml

    r07e4fe84 rd76bea4  
    1313
    1414    <ul>
    15       <li>Load Average: <t:transparent t:render="load_average" /></li>
    16       <li>Peak Load: <t:transparent t:render="peak_load" /></li>
    1715      <li>Files Uploaded (immutable): <t:transparent t:render="uploads" /></li>
    1816      <li>Files Downloaded (immutable): <t:transparent t:render="downloads" /></li>
  • TabularUnified src/allmydata/web/status.py

    r07e4fe84 rd76bea4  
    15671567
    15681568    @renderer
    1569     def load_average(self, req, tag):
    1570         return tag(str(self._stats["stats"].get("load_monitor.avg_load")))
    1571 
    1572     @renderer
    1573     def peak_load(self, req, tag):
    1574         return tag(str(self._stats["stats"].get("load_monitor.max_load")))
    1575 
    1576     @renderer
    15771569    def uploads(self, req, tag):
    15781570        files = self._stats["counters"].get("uploader.files_uploaded", 0)