Recently I've been working on something similar to a QR code scanner, which also needs to grab the image inside a particular region of the camera frame. Let me go straight to the most important pieces of code.

The listing below initializes the AV capture side, which gets the camera image displayed on the view. Before the code, here is a quick rundown of the problems I ran into along the way and how I solved them.

1. Cutting a "hole" in a layer, which is the actual crop region. I used a CAShapeLayer with the even-odd fill rule, so it can be applied as a mask to the coverLayer that sits on top of the previewLayer.
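
As a minimal sketch of that idea (standalone and slightly simplified compared to the full listing at the end of the post; `holeRect` is a hypothetical stand-in for the crop rect):

<pre name="code" class="objc">// Dim the whole view, then punch a transparent "hole" over holeRect
// by masking the cover with an even-odd filled CAShapeLayer.
CALayer *coverLayer = [CALayer layer];
coverLayer.frame = self.bounds;
coverLayer.backgroundColor = [[[UIColor blackColor] colorWithAlphaComponent:0.6] CGColor];

CAShapeLayer *maskLayer = [CAShapeLayer layer];
CGMutablePathRef path = CGPathCreateMutable();
CGPathAddRect(path, NULL, holeRect);     // inner rect: the crop region
CGPathAddRect(path, NULL, self.bounds);  // outer rect: the whole view
maskLayer.fillRule = kCAFillRuleEvenOdd; // the overlap is counted twice, so it stays unfilled
maskLayer.path = path;
CGPathRelease(path);

coverLayer.mask = maskLayer; // the unfilled hole becomes transparent
[self.layer addSublayer:coverLayer];
</pre>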

2. Getting hold of the full frame image is easy: it arrives as the sampleBuffer in the delegate callback. I used AVCaptureVideoDataOutput, which keeps delivering sampled frames continuously.
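
Condensed from the full listing, the output setup looks roughly like this (`session` is assumed to be an existing AVCaptureSession and `self` the sample buffer delegate):

<pre name="code" class="objc">// Deliver every captured frame to
// captureOutput:didOutputSampleBuffer:fromConnection: on a serial queue.
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.alwaysDiscardsLateVideoFrames = YES; // drop frames we cannot process in time
[output setVideoSettings:@{ (NSString *)kCVPixelBufferPixelFormatTypeKey :
                            @(kCVPixelFormatType_32BGRA) }];
[output setSampleBufferDelegate:self
                          queue:dispatch_queue_create("com.scan.video.sample_queue",
                                                      DISPATCH_QUEUE_SERIAL)];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}
</pre>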

3. Extracting the crop region's image from the full frame. This is where I spent the most time and effort, and for a while I simply could not get the cropped image right. I first tried cropping with CGImageCreateWithImageInRect, but the resulting image had the wrong position and size. Then I switched to a CGContext-based approach, and it was still off. After a lot of googling and mulling it over, the cause became clear: the preview layer's video gravity was a fill mode, so the actual frame size was not the same as the screen size. Once I was confident that was the problem, the fix followed: for each videoGravity mode, compute where the crop region actually lands inside the captured image. That became the calcRect method, which maps the "hole" carved out on screen to the corresponding rect in the image.
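
To illustrate the math, here is a simplified sketch of the mapping for the aspect-fit case only (a hypothetical helper, not the code below; the real calcRect in the listing also handles aspect-fill):

<pre name="code" class="objc">// Map a crop rect given in view coordinates to image coordinates,
// assuming AVLayerVideoGravityResizeAspect (letterboxed preview).
static CGRect MapCropRectAspectFit(CGRect cropRect, CGSize viewSize, CGSize imageSize)
{
    // 1. The rect the image actually occupies on screen under aspect-fit.
    CGFloat scale = MIN(viewSize.width / imageSize.width,
                        viewSize.height / imageSize.height);
    CGSize presented = CGSizeMake(imageSize.width * scale, imageSize.height * scale);
    CGPoint origin = CGPointMake((viewSize.width - presented.width) / 2.0,
                                 (viewSize.height - presented.height) / 2.0);

    // 2. Shift the crop rect into the presented image's coordinate space,
    //    then divide by the scale to land in image pixels.
    return CGRectMake((cropRect.origin.x - origin.x) / scale,
                      (cropRect.origin.y - origin.y) / scale,
                      cropRect.size.width / scale,
                      cropRect.size.height / scale);
}
</pre>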

It finally works. Take a look if you're interested.

<pre name="code" class="objc">// 
//  ScanView.m 
//  xxoo 
// 
//  Created by Tommy on 13-11-6. 
//  Copyright (c) 2013 Tommy. All rights reserved. 
// 
   
#import "ScanView.h" 
#import <AVFoundation/AVFoundation.h> 
   
   
static inline double radians (double degrees) {return degrees * M_PI/180;} 
   
@interface ScanView()<AVCaptureVideoDataOutputSampleBufferDelegate> 
   
@property AVCaptureVideoPreviewLayer* previewLayer; 
@property AVCaptureSession* session; 
@property AVCaptureDevice* videoDevice; 
@property dispatch_queue_t camera_sample_queue; 
@property CALayer* coverLayer; 
@property CAShapeLayer* cropLayer; 
@property CALayer* stillImageLayer; 
@property  AVCaptureStillImageOutput* stillImageOutput; 
   
@property UIImageView* stillImageView; 
@property UIImage* cropImage; 
   
@property BOOL hasSetFocus; 
   
   
   
@end 
   
@implementation ScanView 
   
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
        self.hasSetFocus = NO;
        [self initAVCaptuer];
        [self initOtherLayers];
    }
    return self;
}
/*
// Only override drawRect: if you perform custom drawing.
// An empty implementation adversely affects performance during animation.
- (void)drawRect:(CGRect)rect
{
    // Drawing code
}
*/ 
-(void)layoutSubviews
{
    [self.previewLayer setFrame:self.bounds];
    [self.coverLayer setFrame:self.bounds];
    self.coverLayer.mask = self.cropLayer;
}
   
- (void) initAVCaptuer{ 
       
    self.cropRect = CGRectZero; 
       
    self.videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; 
    AVCaptureDeviceInput* input = [[AVCaptureDeviceInput alloc]initWithDevice:self.videoDevice error:nil]; 
       
    AVCaptureVideoDataOutput* output = [[AVCaptureVideoDataOutput alloc]init]; 
    output.alwaysDiscardsLateVideoFrames = YES; 
    self.camera_sample_queue = dispatch_queue_create ("com.scan.video.sample_queue", DISPATCH_QUEUE_SERIAL); 
    [output setSampleBufferDelegate:self queue:self.camera_sample_queue]; 
       
    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey; 
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; 
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key]; 
    [output setVideoSettings:videoSettings]; 
       
       
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc]init]; 
    NSDictionary* outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG}; 
    [self.stillImageOutput setOutputSettings:outputSettings]; 
       
    self.session = [[AVCaptureSession alloc]init]; 
    self.session.sessionPreset = AVCaptureSessionPresetMedium; 
       
    if ([self.session canAddInput:input])
    {
        [self.session addInput:input];

        if ([self.session canAddOutput:output])
        {
            [self.session addOutput:self.stillImageOutput];
            [self.session addOutput:output];

            self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
            self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

            [self.layer addSublayer:self.previewLayer];

            return; // success
        }
    }

    self.session = nil;
}
- (void)setCropRect:(CGRect)cropRect
{
    _cropRect = cropRect;
    if(!CGRectEqualToRect(CGRectZero, self.cropRect)){

        self.cropLayer = [[CAShapeLayer alloc] init];
        CGMutablePathRef path = CGPathCreateMutable();

        CGPathAddRect(path, nil, self.cropRect);
        CGPathAddRect(path, nil, self.bounds);

        [self.cropLayer setFillRule:kCAFillRuleEvenOdd];
        [self.cropLayer setPath:path];
        [self.cropLayer setFillColor:[[UIColor whiteColor] CGColor]];
        CGPathRelease(path); // the layer keeps its own copy of the path

        [self.cropLayer setNeedsDisplay];

        //[self setVideoFocus];
    }

    [self.stillImageLayer setFrame:CGRectMake(100, 450, CGRectGetWidth(cropRect), CGRectGetHeight(cropRect))];
}
- (void) setVideoFocus{

    NSError *error;
    CGPoint foucsPoint = CGPointMake(CGRectGetMidX(self.cropRect), CGRectGetMidY(self.cropRect));
    if([self.videoDevice isFocusPointOfInterestSupported]
       && [self.videoDevice lockForConfiguration:&error] && !self.hasSetFocus){
        self.hasSetFocus = YES;
        [self.videoDevice setFocusPointOfInterest:[self convertToPointOfInterestFromViewCoordinates:foucsPoint]];
        [self.videoDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [self.videoDevice unlockForConfiguration];
    }
//    [self.videoDevice setFocusMode:AVCaptureFocusModeAutoFocus];
    NSLog(@"error:%@", error);
}
   
- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates
{
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = self.frame.size;

    AVCaptureVideoPreviewLayer *videoPreviewLayer = self.previewLayer;

    if ([self.previewLayer isMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }

    if ( [[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize] ) {
        pointOfInterest = CGPointMake(viewCoordinates.y / frameSize.height, 1.f - (viewCoordinates.x / frameSize.width));
    } else {
        CGRect cleanAperture;
        for (AVCaptureInputPort *port in [[[[self session] inputs] lastObject] ports]) {
            if ([port mediaType] == AVMediaTypeVideo) {
                cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
                CGSize apertureSize = cleanAperture.size;
                CGPoint point = viewCoordinates;

                CGFloat apertureRatio = apertureSize.height / apertureSize.width;
                CGFloat viewRatio = frameSize.width / frameSize.height;
                CGFloat xc = .5f;
                CGFloat yc = .5f;

                if ( [[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect] ) {
                    if (viewRatio > apertureRatio) {
                        CGFloat y2 = frameSize.height;
                        CGFloat x2 = frameSize.height * apertureRatio;
                        CGFloat x1 = frameSize.width;
                        CGFloat blackBar = (x1 - x2) / 2;
                        if (point.x >= blackBar && point.x <= blackBar + x2) {
                            xc = point.y / y2;
                            yc = 1.f - ((point.x - blackBar) / x2);
                        }
                    } else {
                        CGFloat y2 = frameSize.width / apertureRatio;
                        CGFloat y1 = frameSize.height;
                        CGFloat x2 = frameSize.width;
                        CGFloat blackBar = (y1 - y2) / 2;
                        if (point.y >= blackBar && point.y <= blackBar + y2) {
                            xc = ((point.y - blackBar) / y2);
                            yc = 1.f - (point.x / x2);
                        }
                    }
                } else if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    if (viewRatio > apertureRatio) {
                        CGFloat y2 = apertureSize.width * (frameSize.width / apertureSize.height);
                        xc = (point.y + ((y2 - frameSize.height) / 2.f)) / y2;
                        yc = (frameSize.width - point.x) / frameSize.width;
                    } else {
                        CGFloat x2 = apertureSize.height * (frameSize.height / apertureSize.width);
                        yc = 1.f - ((point.x + ((x2 - frameSize.width) / 2)) / x2);
                        xc = point.y / frameSize.height;
                    }
                }

                pointOfInterest = CGPointMake(xc, yc);
                break;
            }
        }
    }

    return pointOfInterest;
}
- (void) initOtherLayers{
    self.coverLayer = [CALayer layer];

    self.coverLayer.backgroundColor = [[[UIColor blackColor] colorWithAlphaComponent:0.6] CGColor];
    [self.layer addSublayer:self.coverLayer];

    if(!CGRectEqualToRect(CGRectZero, self.cropRect)){

        self.cropLayer = [[CAShapeLayer alloc] init];
        CGMutablePathRef path = CGPathCreateMutable();

        CGPathAddRect(path, nil, self.cropRect);
        CGPathAddRect(path, nil, self.bounds);

        [self.cropLayer setFillRule:kCAFillRuleEvenOdd];
        [self.cropLayer setPath:path];
        [self.cropLayer setFillColor:[[UIColor redColor] CGColor]];
        CGPathRelease(path); // the layer keeps its own copy of the path
    }

    self.stillImageLayer = [CALayer layer];
    self.stillImageLayer.backgroundColor = [[UIColor yellowColor] CGColor];
    self.stillImageLayer.contentsGravity = kCAGravityResizeAspect;
    [self.coverLayer addSublayer:self.stillImageLayer];


    self.stillImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 300, 100, 100)];
    self.stillImageView.backgroundColor = [UIColor redColor];
    self.stillImageView.contentMode = UIViewContentModeScaleAspectFit;
    [self addSubview:self.stillImageView];


    self.previewLayer.contentsGravity = kCAGravityResizeAspect;
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{

    [self setVideoFocus];

    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    self.cropImage = [self cropImageInRect:image];

    dispatch_async(dispatch_get_main_queue(), ^{

       [self.stillImageView setImage:image];
      // [self.stillImageLayer setContents:(id)[self.cropImage CGImage]];
    });
}
// Create a UIImage from the sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    // Get the Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row of the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer's width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    //NSLog(@"%zu,%zu",width,height);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context from the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Release the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create a UIImage from the Quartz image, rotated to portrait
    //UIImage *image = [UIImage imageWithCGImage:quartzImage];
    UIImage *image = [UIImage imageWithCGImage:quartzImage scale:1.0f orientation:UIImageOrientationRight];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}
   
   
- (CGRect) calcRect:(CGSize)imageSize{
    NSString* gravity = self.previewLayer.videoGravity;
    CGRect cropRect = self.cropRect;
    CGSize screenSize = self.previewLayer.bounds.size;

    CGFloat screenRatio = screenSize.height / screenSize.width;
    CGFloat imageRatio = imageSize.height / imageSize.width;

    CGRect presentImageRect = self.previewLayer.bounds;
    CGFloat scale = 1.0;


    if([AVLayerVideoGravityResizeAspect isEqual:gravity]){

        CGFloat presentImageWidth = imageSize.width;
        CGFloat presentImageHeigth = imageSize.height;
        if(screenRatio > imageRatio){
            presentImageWidth = screenSize.width;
            presentImageHeigth = presentImageWidth * imageRatio;

        }else{
            presentImageHeigth = screenSize.height;
            presentImageWidth = presentImageHeigth / imageRatio;
        }

        presentImageRect.size = CGSizeMake(presentImageWidth, presentImageHeigth);
        presentImageRect.origin = CGPointMake((screenSize.width-presentImageWidth)/2.0, (screenSize.height-presentImageHeigth)/2.0);

    }else if([AVLayerVideoGravityResizeAspectFill isEqual:gravity]){

        CGFloat presentImageWidth = imageSize.width;
        CGFloat presentImageHeigth = imageSize.height;
        if(screenRatio > imageRatio){
            presentImageHeigth = screenSize.height;
            presentImageWidth = presentImageHeigth / imageRatio;
        }else{
            presentImageWidth = screenSize.width;
            presentImageHeigth = presentImageWidth * imageRatio;
        }

        presentImageRect.size = CGSizeMake(presentImageWidth, presentImageHeigth);
        presentImageRect.origin = CGPointMake((screenSize.width-presentImageWidth)/2.0, (screenSize.height-presentImageHeigth)/2.0);

    }else{
        NSAssert(0, @"unsupported video gravity:%@", gravity);
    }

    scale = CGRectGetWidth(presentImageRect) / imageSize.width;

    CGRect rect = cropRect;
    rect.origin = CGPointMake(CGRectGetMinX(cropRect)-CGRectGetMinX(presentImageRect), CGRectGetMinY(cropRect)-CGRectGetMinY(presentImageRect));

    rect.origin.x /= scale;
    rect.origin.y /= scale;
    rect.size.width /= scale;
    rect.size.height /= scale;

    return rect;
}
#define SUBSET_SIZE 360 
   
- (UIImage*) cropImageInRect:(UIImage*)image{

    CGSize size = [image size];
    CGRect cropRect = [self calcRect:size];

    float scale = fminf(1.0f, fmaxf(SUBSET_SIZE / cropRect.size.width, SUBSET_SIZE / cropRect.size.height));
    CGPoint offset = CGPointMake(-cropRect.origin.x, -cropRect.origin.y);

    size_t subsetWidth = cropRect.size.width * scale;
    size_t subsetHeight = cropRect.size.height * scale;


    CGColorSpaceRef grayColorSpace = CGColorSpaceCreateDeviceGray();

    CGContextRef ctx =
    CGBitmapContextCreate(nil,
                          subsetWidth,
                          subsetHeight,
                          8,
                          0,
                          grayColorSpace,
                          kCGImageAlphaNone|kCGBitmapByteOrderDefault);
    CGColorSpaceRelease(grayColorSpace);
    CGContextSetInterpolationQuality(ctx, kCGInterpolationNone);
    CGContextSetAllowsAntialiasing(ctx, false);

    // adjust the coordinate system
    CGContextTranslateCTM(ctx, 0.0, subsetHeight);
    CGContextScaleCTM(ctx, 1.0, -1.0);


    UIGraphicsPushContext(ctx);
    CGRect rect = CGRectMake(offset.x * scale, offset.y * scale, scale * size.width, scale * size.height);

    [image drawInRect:rect];

    UIGraphicsPopContext();

    CGContextFlush(ctx);


    CGImageRef subsetImageRef = CGBitmapContextCreateImage(ctx);

    UIImage* subsetImage = [UIImage imageWithCGImage:subsetImageRef];

    CGImageRelease(subsetImageRef);

    CGContextRelease(ctx);


    return subsetImage;
}
   
   
   
- (void) start{

    dispatch_sync(self.camera_sample_queue, ^{
        [self.session startRunning]; });
}

- (void) stop{
    if(self.session){
        [self.session stopRunning];
    }
}
   
@end 
</pre>
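
For completeness, a hypothetical usage sketch from a hosting view controller (the frame and crop rect values here are made up):

<pre name="code" class="objc">// Host the scan view, define the on-screen "hole", and start the session.
ScanView *scanView = [[ScanView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:scanView];
scanView.cropRect = CGRectMake(60, 120, 200, 200); // the crop region in view coordinates
[scanView start];

// ... and when the controller goes away:
[scanView stop];
</pre>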
