  • Setting Up and Using MinIO

      MinIO is an object storage server released under the Apache License v2.0 and written in Go. It is compatible with the Amazon S3 cloud storage API and is well suited to storing large volumes of unstructured data, such as pictures, videos, log files, backups, and container/VM images; a single object can be any size, from a few KB up to a maximum of 5 TB. MinIO is a very lightweight service and can be combined with other applications as easily as services like NodeJS, Redis, or MySQL.

    Official quick-start guide: https://docs.min.io/cn/minio-quickstart-guide.html

    Setting up with Docker:

    docker run -p 9000:9000 --name minio1 \
        -e "MINIO_ACCESS_KEY=admin" \
        -e "MINIO_SECRET_KEY=admin12345" \
        -v /mnt/data:/data \
        -v /mnt/config:/root/.minio \
        docker.io/minio/minio:latest server /data
    

      Note: the mapped host directories must be created beforehand. Once the container is up, browse to http://<host-IP>:9000 and sign in with the configured keys to browse the stored files. MinIO requires the secret key to be at least 8 characters, so a short value such as "admin" will prevent the server from starting. This service is often used in projects to build file-sharing platforms.
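For repeatable setups, the same container can also be described in a compose file. This is a minimal sketch with the same mappings as the `docker run` command above; the credentials are placeholders:

```yaml
version: "3.7"
services:
  minio:
    image: docker.io/minio/minio:latest
    command: server /data
    ports:
      - "9000:9000"
    environment:
      MINIO_ACCESS_KEY: admin        # placeholder credentials
      MINIO_SECRET_KEY: admin12345   # secret key must be at least 8 characters
    volumes:
      - /mnt/data:/data
      - /mnt/config:/root/.minio
```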

    A simple example

      Following the first example in the official guide, you may hit this error: minio.error.InvalidXMLError: InvalidXMLError('"Error" XML is not parsable.'). The cause is that the endpoint was given without its port.
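A quick way to catch this before constructing the client is to check that the endpoint string carries an explicit port. This is a hypothetical helper, not part of the minio SDK:

```python
from urllib.parse import urlsplit

def has_explicit_port(endpoint: str) -> bool:
    # Minio() expects a bare "host:port" endpoint; prepend "//" so that
    # urlsplit treats the string as a network location and can parse the port.
    return urlsplit("//" + endpoint).port is not None

print(has_explicit_port("10.2.1.111:9000"))  # True
print(has_explicit_port("10.2.1.111"))       # False
```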

    from minio import Minio
    # Note: ResponseError, BucketAlreadyOwnedByYou and BucketAlreadyExists are
    # exception classes from the pre-7.0 minio SDK; newer releases raise S3Error.
    from minio.error import (ResponseError, BucketAlreadyOwnedByYou,
                             BucketAlreadyExists)
    
    # Initialize the minioClient object with an endpoint, access key and secret key.
    minioClient = Minio('10.2.1.111:9000',
                        access_key='admin1234',
                        secret_key='admin1234',
                        secure=False)
    
    # Call make_bucket to create a bucket.
    try:
        minioClient.make_bucket("maylogs", location="us-east-1")
    except BucketAlreadyOwnedByYou as err:
        pass
    except BucketAlreadyExists as err:
        pass
    except ResponseError as err:
        raise
    
    # Put an object 'pumaserver_debug.log' with contents from './requirements.txt'.
    try:
        minioClient.fput_object('maylogs', 'pumaserver_debug.log', './requirements.txt')
    except ResponseError as err:
        print(err)
    

      

    Uploading and downloading with MinIO from Python

    # -*- coding: utf-8 -*-
    """
    @Time : 2020/8/4 2:59 PM
    @Auth : wangli
    """
    from minio import Minio
    import os
    
    
    def urljoin(*args) -> str:
        """
        Joins given arguments into a url. Trailing but not leading slashes are
        stripped for each argument.
        """
        url = "/".join(map(lambda x: str(x).rstrip('/'), args))
        return url
    
    
    class MinioImp:
    
        def __init__(self, host, acc_key, sec_key, cache_loc):
            """
            :param host: Minio host:port
            :param acc_key: user key
            :param sec_key:  secret key
            :param cache_loc:  localtion path
            """
            # Initialize Minio instance
            self._s3_client = Minio(host, access_key=acc_key, secret_key=sec_key, secure=False)
            self.cache_location = cache_loc
    
        def get_cachefile_path(self, bucket_name, object_name):
            # Create the destination filepath
            cachefile_path = self.cache_location + "/" + bucket_name + "/" + object_name
            return cachefile_path
    
        def get_partial(self, bucket_name, object_name, start, size):
            # Gets data from offset to length of an object
            return self._s3_client.get_partial_object(bucket_name, object_name, start, size)
    
        def object_exists(self, bucket_name, object_name):
            """Check whether the object actually exists."""
            try:
                self._s3_client.stat_object(bucket_name, object_name)
                return True
            except Exception:
                return False
    
        def download(self, bucket_name, object_name):
            """
               Download the object to the cache as a file.
               If it already exists then check whether the object on the object store is newer than the file on the disk.
               :return: string filename in the cache
            """
            # get the path in the cache
            dest_path = self.get_cachefile_path(bucket_name, object_name)
    
            # First check whether the file already exists in the cache
            if os.path.exists(dest_path):
                # these exceptions shouldn't really happen, but be defensive
                try:
                    # get object information and metadata of the object
                    object_stats = self._s3_client.stat_object(bucket_name,
                                                               object_name)
                except Exception:
                    raise IOError("Error: " + bucket_name + "/" + object_name +
                                  " not found on the object store.")
                # get the modification time of the corresponding file on disk
                try:
                    file_stats = os.stat(dest_path)
                except OSError:
                    raise IOError("Error: " + dest_path + " not found.")

                # the object on the object store is newer than the one on disk;
                # last_modified is a datetime while st_mtime is a float, so
                # compare POSIX timestamps
                if object_stats.last_modified.timestamp() > file_stats.st_mtime:
                    # re-download the file
                    try:
                        self._s3_client.fget_object(bucket_name, object_name, dest_path)
                    except Exception:
                        raise IOError("Error: cannot download " + dest_path + ".")

            else:
                # Does not exist locally, so we have to download the file;
                # first create the destination directory if it doesn't exist
                dest_dir = os.path.dirname(dest_path)
                if not os.path.isdir(dest_dir):
                    os.makedirs(dest_dir)
                # now try downloading the file
                try:
                    self._s3_client.fget_object(bucket_name, object_name, dest_path)
                except Exception:
                    raise IOError("Error: cannot download " + dest_path + ".")
    
            return dest_path
    
        def create_bucket(self, bucket_name):
            """
               Create a bucket on S3 storage
            """
            # full url for error reporting
            full_url = urljoin(self.cache_location, bucket_name)
            # check whether the bucket already exists
            if not self._s3_client.bucket_exists(bucket_name):
                try:
                    self._s3_client.make_bucket(bucket_name)
                except Exception as e:
                    raise IOError("Error: " + full_url + " cannot create bucket: " + str(e))
    
        def delete_bucket(self, bucket_name):
            self._s3_client.remove_bucket(bucket_name)
    
        def delete(self, bucket_name, object_name):
            """Delete a single object"""
            self._s3_client.remove_object(bucket_name, object_name)
    
        def uploadload(self, bucket_name, object_name):
            """
               Write a file in the cache to the s3 storage
            """
            # full url for error reporting
            full_url = urljoin(self.cache_location, bucket_name, object_name)
            # get the path in the cache
            s3_cache_filename = self.get_cachefile_path(bucket_name, object_name)
    
            # check the file exists in the cache
            if not os.path.exists(s3_cache_filename):
                raise IOError("Error: " + s3_cache_filename + " not found in cache.")

            # create the bucket if necessary
            self.create_bucket(bucket_name)

            try:
                self._s3_client.fput_object(bucket_name, object_name, s3_cache_filename)
            except Exception as e:
                raise IOError("Error: " + full_url + " cannot write S3 object: " + str(e))
    
    
    if __name__ == '__main__':
        minio = MinioImp('10.2.1.1:9000', 'admin4', 'admin', '.')
    
        # download
        # minio.download('maylogs', 'pumaserver_debug.log')
    
        # upload
        # minio.uploadload('maylogs', 'test.py')
    
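The freshness check in `download` above compares the object's `last_modified` (a `datetime` in the minio SDK) against the local file's `st_mtime` (a POSIX timestamp as a `float`), so one side has to be converted before comparing. A standalone sketch of that comparison, with a made-up object date for illustration:

```python
import os
import tempfile
from datetime import datetime, timezone

def is_remote_newer(remote_last_modified: datetime, local_path: str) -> bool:
    # Convert the datetime to a POSIX timestamp so both sides are floats.
    return remote_last_modified.timestamp() > os.stat(local_path).st_mtime

# Create a local file stamped "now"; an object dated 2020 is therefore older.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"cached contents")
    path = f.name

old_object = datetime(2020, 8, 4, tzinfo=timezone.utc)
print(is_remote_newer(old_object, path))  # False: the local copy is newer
os.unlink(path)
```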

      

  • Original post: https://www.cnblogs.com/double-W/p/13434013.html