
SharePoint 2013 Hosting – HostForLIFE.eu :: How to Upload Large Files in SharePoint Online
Feb 23rd
By default, browsers place restrictions on posting large amounts of data from client-side script.
While using an Angular HTTP post to upload a file, I ran into an issue: uploads larger than 100 MB break in some browsers, such as Chrome and Firefox.
To work around this, I split the file into chunks and post them through the SharePoint REST API.
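The chunking scheme can be sketched in isolation before looking at the full Angular service. The helper names below (`buildChunks`, `getUploadMethod`, `CHUNK_SIZE`) are illustrative, but the logic mirrors the approach used later: split the file size into 1 MB chunk descriptors, each tagged with the SharePoint REST method (`startupload`, `continueupload`, or `finishupload`) that will send it.

```typescript
// Sketch of the chunking scheme (helper names are illustrative).
// Each chunk records its byte offset, its length, and the SharePoint
// REST method that uploads it.
interface Chunk {
  offset: number;
  length: number;
  method: 'startupload' | 'continueupload' | 'finishupload';
}

const CHUNK_SIZE = 1000000; // 1 MB chunks, as in the service below

function getUploadMethod(offset: number, length: number, total: number): Chunk['method'] {
  if (offset + length >= total) return 'finishupload'; // last chunk
  if (offset === 0) return 'startupload';              // first chunk
  return 'continueupload';                             // everything in between
}

function buildChunks(total: number): Chunk[] {
  // Files smaller than 1 MB are split into two chunks of roughly 80%/20%.
  let length = CHUNK_SIZE > total ? Math.round(total * 0.8) : CHUNK_SIZE;
  const chunks: Chunk[] = [];
  let offset = 0;
  while (offset < total) {
    if (offset + length > total) length = total - offset; // shorter final chunk
    chunks.push({ offset, length, method: getUploadMethod(offset, length, total) });
    offset += length;
  }
  return chunks;
}

// A 2.5 MB file yields three chunks: startupload, continueupload, finishupload.
console.log(buildChunks(2500000).map(c => c.method).join(','));
```

A 2.5 MB file produces two full 1 MB chunks and one 0.5 MB finishing chunk, which is why the service needs three distinct REST methods for one upload session.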
Angular + SharePoint Large file upload
Follow the steps below to upload large files.
Step 1
Create a “File upload” component template like the one below.
```html
<input class="upload form-control" id="DocUploader" placeholder="Upload file" type="file" (change)="UploadFile($event)">
```
Step 2
Below is the TypeScript code for the file upload component.
```typescript
import { Component } from '@angular/core';
import { FileUploadService } from './fileupload.service';

@Component({
  selector: 'app-fileupload',
  templateUrl: './fileupload.component.html',
})
export class FileUploadComponent {
  // Inject the upload service (the injection was missing in the original snippet).
  constructor(private fileUploadService: FileUploadService) { }

  UploadFile(event: any) {
    let fileList: FileList = event.target.files;
    if (fileList.length !== 0) {
      // Pass the File object itself (not just its name) so the service
      // can read the file contents and size.
      this.fileUploadService.fileUpload(fileList[0], "Documents", fileList[0].name)
        .then(() => {
          console.log("File uploaded successfully");
        })
        .catch(addFileToFolderError => {
          console.log(addFileToFolderError);
        });
    }
  }
}
```
Step 3
The most important part of the chunk-based upload implementation is the Angular service. The FileUploadService code is shown below.
```typescript
import { Injectable } from '@angular/core';

declare var _spPageContextInfo: any;
declare var SP: any;

@Injectable()
export class FileUploadService {
  public siteUrl: string = _spPageContextInfo.webAbsoluteUrl;
  public siteRelativeUrl: string = _spPageContextInfo.webServerRelativeUrl != "/"
    ? _spPageContextInfo.webServerRelativeUrl
    : "";

  public fileUpload(file: any, documentLibrary: string, fileName: string) {
    return new Promise((resolve, reject) => {
      this.createDummyFile(fileName, documentLibrary).then(result => {
        let fr = new FileReader();
        let offset = 0;
        // The total file size in bytes.
        let total = file.size;
        // 1 MB chunks, expressed in bytes. If the file is smaller than 1 MB,
        // split it into two chunks of 80% and 20% of its size.
        let length = 1000000 > total ? Math.round(total * 0.8) : 1000000;
        let chunks = [];

        // Read the file with the FileReader HTML5 API (as an ArrayBuffer) -
        // readAsBinaryString is not available in IE.
        fr.readAsArrayBuffer(file);
        fr.onload = (evt: any) => {
          while (offset < total) {
            // If this is the final chunk, shorten it to the remaining bytes.
            if (offset + length > total) {
              length = total - offset;
            }
            // Record the chunk boundaries and the associated REST method
            // (start, continue or finish).
            chunks.push({
              offset,
              length,
              method: this.getUploadMethod(offset, length, total)
            });
            offset += length;
          }
          // Each chunk is worth a percentage of the total size of the file.
          const chunkPercentage = (total / chunks.length) / total * 100;
          if (chunks.length > 0) {
            // The unique GUID identifier used throughout the upload session.
            const id = this.guid();
            // Start the upload - send the data to SharePoint.
            this.uploadFile(evt.target.result, id, documentLibrary, fileName,
              chunks, 0, 0, chunkPercentage, resolve, reject);
          }
        };
      }).catch(err => reject(err));
    });
  }

  // Creates an empty placeholder file in the library; the chunked upload
  // then overwrites its content.
  createDummyFile(fileName, libraryName) {
    return new Promise((resolve, reject) => {
      // Construct the endpoint - GetFolderByServerRelativePath is available
      // in SharePoint Online only.
      var serverRelativeUrlToFolder = "decodedurl='" + this.siteRelativeUrl + "/" + libraryName + "'";
      var endpoint = this.siteUrl
        + "/_api/Web/GetFolderByServerRelativePath(" + serverRelativeUrlToFolder + ")/files"
        + "/add(overwrite=true, url='" + fileName + "')";
      const headers = {
        "accept": "application/json;odata=verbose"
      };
      this.executeAsync(endpoint, this.convertDataBinaryString(2), headers)
        .then(file => resolve(true))
        .catch(err => reject(err));
    });
  }

  // Converts an ArrayBuffer into a binary string to send in the REST request.
  convertDataBinaryString(data) {
    let fileData = '';
    let byteArray = new Uint8Array(data);
    for (var i = 0; i < byteArray.byteLength; i++) {
      fileData += String.fromCharCode(byteArray[i]);
    }
    return fileData;
  }

  executeAsync(endPointUrl, data, requestHeaders) {
    return new Promise((resolve, reject) => {
      // SP.RequestExecutor posts the binary string body to SharePoint.
      let executor = new SP.RequestExecutor(this.siteUrl);
      executor.executeAsync({
        url: endPointUrl,
        method: "POST",
        body: data,
        binaryStringRequestBody: true,
        headers: requestHeaders,
        success: offset => resolve(offset),
        error: err => reject(err.responseText)
      });
    });
  }

  // Sets up the REST request and sends the chunk of the file along with the
  // unique identifier (uploadId).
  uploadFileChunk(id, libraryPath, fileName, chunk, data, byteOffset) {
    return new Promise((resolve, reject) => {
      let offset = chunk.offset === 0 ? '' : ',fileOffset=' + chunk.offset;
      // Parameterising the components of this endpoint avoids the maximum URL
      // length problem in SharePoint (query string parameters are not counted
      // against that limit).
      let endpoint = this.siteUrl
        + "/_api/web/getfilebyserverrelativeurl('" + this.siteRelativeUrl + "/" + libraryPath + "/" + fileName + "')/"
        + chunk.method + "(uploadId=guid'" + id + "'" + offset + ")";
      const headers = {
        "Accept": "application/json; odata=verbose",
        "Content-Type": "application/octet-stream"
      };
      this.executeAsync(endpoint, data, headers)
        .then(offset => resolve(offset))
        .catch(err => reject(err));
    });
  }

  // The primary method - recursively uploads each chunk to the library until
  // the complete file has been sent.
  uploadFile(result, id, libraryPath, fileName, chunks, index, byteOffset, chunkPercentage, resolve, reject) {
    // Slice the file blob into the chunk to send in this request.
    const data = this.convertFileToBlobChunks(result, byteOffset, chunks[index]);
    // Upload the chunk using REST, with the unique upload GUID as the identifier.
    this.uploadFileChunk(id, libraryPath, fileName, chunks[index], data, byteOffset).then(value => {
      const isFinished = index === chunks.length - 1;
      index += 1;
      const percentageComplete = isFinished ? 100 : Math.round(index * chunkPercentage);
      if (index < chunks.length) {
        // More chunks to process before the file is finished - continue.
        this.uploadFile(result, id, libraryPath, fileName, chunks, index, byteOffset, chunkPercentage, resolve, reject);
      } else {
        resolve(value);
      }
    }).catch(err => {
      console.log('Error in uploadFileChunk! ' + err);
      reject(err);
    });
  }

  // Helper method - picks the correct REST method for the chunk being sent.
  getUploadMethod(offset, length, total) {
    if (offset + length + 1 > total) {
      return 'finishupload';
    } else if (offset === 0) {
      return 'startupload';
    } else if (offset < total) {
      return 'continueupload';
    }
    return null;
  }

  // Slices the ArrayBuffer to the appropriate chunk, then converts it to a
  // binary string.
  convertFileToBlobChunks(result, byteOffset, chunkInfo) {
    let arrayBuffer = result.slice(chunkInfo.offset, chunkInfo.offset + chunkInfo.length);
    return this.convertDataBinaryString(arrayBuffer);
  }

  guid() {
    function s4() {
      return Math.floor((1 + Math.random()) * 0x10000).toString(16).substring(1);
    }
    return s4() + s4() + '-' + s4() + '-' + s4() + '-' + s4() + '-' + s4() + s4() + s4();
  }
}
```
Step 4
Reference SP.js and SP.RequestExecutor.js in the deployed .aspx (or .html) pages so that the default SharePoint request executor can be used to post the file.
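As an example, the script references in a classic page might look like the following. The `/_layouts/15/` paths are the standard locations for these SharePoint scripts, but they are an assumption here and may need adjusting for your deployment; SP.Runtime.js is included because SP.js depends on it.

```html
<!-- Load the SharePoint JSOM scripts before the Angular bundle.
     The /_layouts/15/ paths are the standard locations (adjust if needed). -->
<script type="text/javascript" src="/_layouts/15/SP.Runtime.js"></script>
<script type="text/javascript" src="/_layouts/15/SP.js"></script>
<script type="text/javascript" src="/_layouts/15/SP.RequestExecutor.js"></script>
```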
Output
Finally, the uploaded file will be available in SharePoint Documents Library.
SharePoint 2013 Hosting – HostForLIFE.eu :: How to Allow Access Requests Email Settings In SharePoint Site Using PnP PowerShell?
Feb 22nd
When users are denied access to a site, they can use the access request feature to ask the Site Owner to grant them access.
Under Site Permissions -> Access Requests Settings, the “Allow Access Requests” feature has an option for setting the notification email. Whenever a user requests access to a site, file, or folder, an email is sent to that address so the owner can grant them access.
If we remove the email from the “Allow Access Requests” settings, the feature is disabled and the owner is no longer notified when a new user needs access.
To enable this email setting and send the notification to the owner, use the below PnP PowerShell snippet.

```powershell
$cred = Get-Credential
Connect-PnPOnline -Url https://<tenant>.sharepoint.com -Credential $cred
Set-PnPRequestAccessEmails -Emails ktskumar@<tenant>.onmicrosoft.com
```
To add multiple emails to the settings, use the below cmdlet.
```powershell
Set-PnPRequestAccessEmails -Emails @( "ktskumar@<tenant>.onmicrosoft.com", "user1@<tenant>.onmicrosoft.com" )
```
The below PnP PowerShell snippet gets the emails associated with the allow access requests settings.

```powershell
$cred = Get-Credential
Connect-PnPOnline -Url https://<tenant>.sharepoint.com -Credential $cred
Get-PnPRequestAccessEmails
```
SharePoint 2013 Hosting – HostForLIFE.eu :: How To Manage Policy And Procedure Documents In SharePoint?
Feb 15th
Nowadays, SharePoint is one of the most popular and productive document management systems. It offers users ample opportunities and is well suited to managing policies and procedures.
Store Policies in Document Libraries (not on the file system)
SharePoint document libraries have numerous benefits for storing policies.
- customizable views
- ability to filter and sort
- addition of new columns/metadata
- ability to trigger workflows
- document management and versioning
NOTE
A SharePoint workflow is made up of automated steps where a document gets approved by one or more people in the organization.
Store multiple versions of a Policy or Procedure and SharePoint Alerts
SharePoint document libraries can be configured with version management turned on. This lets users retrieve a previous version of a document, without the latest changes, or compare changes against the most current copy.
Users can also watch for updates with SharePoint Alerts, which notify them about changes to policies or other events in SharePoint.
SharePoint Workspace
This application synchronizes assets (such as policies) between a desktop and SharePoint. Users can work with local copies of documents when there is no connection to SharePoint and no browser access.
The next time a user connects to SharePoint, Workspace synchronizes the assets between the desktop and SharePoint again.
SharePoint security model
The security model lets users control who can view, edit, or delete content. Permissions can also be granted to groups and thereby applied to all the users within them.
Document ID Service
The Document ID Service in SharePoint has two major benefits.
- Each document in a site collection gets a unique document ID. This reduces the risk of ambiguity when two documents have the same or similar names.
- Users can access a document through a URL that contains no location-specific information. This allows the link to remain intact even if the document is moved.
SharePoint 2013 Hosting – HostForLIFE.eu :: Get Site Quota Information For SharePoint Farm
Jan 25th
Site quotas come in handy when you wish to enforce size limits on your SharePoint site collections. They let you set storage limit and warning limit values for better farm management. The script below retrieves the site quota and related information, such as the maximum storage limit and storage warning level, for the entire farm. The output is in CSV format.
```powershell
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

$templates = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.QuotaTemplates
$tFound = $false
$templateName = "No template found"
$results = @()
$sites = Get-SPSite -Limit ALL

try {
    foreach ($site in $sites) {
        foreach ($qt in $templates) {
            if ($qt.QuotaId -eq $site.Quota.QuotaID) {
                $templateName = $qt.Name
                $tFound = $true
            }
        }
        if ($tFound -eq $false) {
            $templateName = "No template found"
        }
        $tFound = $false

        $RowDetails = @{
            "Site URL"                          = $site.Url
            "Storage Used"                      = $site.Usage.Storage / 1MB
            "Storage Available Warning"         = $site.Quota.StorageWarningLevel / 1MB
            "Storage Available Maximum"         = $site.Quota.StorageMaximumLevel / 1MB
            "Sandboxed Resource Points Warning" = $site.Quota.UserCodeWarningLevel
            "Sandboxed Resource Points Maximum" = $site.Quota.UserCodeMaximumLevel
            "Quota Name"                        = $templateName
        }
        $results += New-Object PSObject -Property $RowDetails
        $site.Dispose()
    }
}
catch {
    $e = $_.Exception
    $line = $_.InvocationInfo.ScriptLineNumber
    $msg = $e.Message
    Write-Host -ForegroundColor Red "Caught exception: $e at line $line"
    Write-Host $msg
    Write-Host "Something went wrong"
}

$results | Export-Csv -Path C:\SiteQuotaDetailedInfo.csv -NoTypeInformation
Write-Host "-------------------- Completed! -----------------------------"
```
SharePoint 2013 Hosting – HostForLIFE.eu :: Get Content Type Usage Details In SharePoint Using Powershell
Jan 18th
This script gives you the usage details and dependencies of any content type, such as the list URL and web URL, wherever it is being used. This comes in handy when you wish to delete a content type and need to make sure all of its dependencies and references are cleaned up.
The output is a CSV file listing where the above-mentioned content type is used. The input parameters are:
- name of the content type
- web application URL
To get started, replace the content type name and web application URL below with the ones to be searched for.
```powershell
$myContentType = "Report Builder Report"
$myWebApp = "http://myWebAppURL.com"

if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

$results = @()
$sites = Get-SPSite -Limit All -WebApplication $myWebApp

try {
    foreach ($site in $sites) {
        Write-Host "Looking in site - " $site.Url
        foreach ($web in $site.AllWebs) {
            Write-Host $web.Url
            foreach ($list in $web.Lists) {
                foreach ($ctype in $list.ContentTypes | ? { $_.Name -eq $myContentType }) {
                    foreach ($item in $list.Items | ? { $_.ContentType.Name -eq $myContentType }) {
                        $RowDetails = @{
                            "List"        = $list.Title
                            "ListURL"     = $list.DefaultView.Url
                            "ContentType" = $item.ContentType.Name
                            "Site"        = $site.Url
                            "Web"         = $web.Url
                        }
                        $results += New-Object PSObject -Property $RowDetails
                        break
                    }
                }
            }
            $web.Dispose()
        }
        $site.Dispose()
    }
}
catch {
    $e = $_.Exception
    $line = $_.InvocationInfo.ScriptLineNumber
    $msg = $e.Message
    Write-Host -ForegroundColor Red "Caught exception: $e at line $line"
    Write-Host $msg
    Write-Host "Something went wrong"
}

$results | Export-Csv -Path C:\ContentTypeUsageReport.csv -NoTypeInformation
Write-Host "======================= Completed! ======================="
```
SharePoint 2013 Hosting – HostForLIFE.eu :: Get All Users In A SharePoint 2010/13/16 Farm Using PowerShell Script
Jan 11th
Welcome to an article on how to get all users in a SharePoint 2010/13/16 farm using a PowerShell script. The script loops through all the web applications and the site collections within them and gets both individual and group users.
This script saves developers a lot of time when gathering all users, whether individual or in groups. So, here is the script.
- Copy the code below to a .ps1 file.
- Run the .ps1 file in the SharePoint Management Shell (or a PowerShell session with the SharePoint snap-in loaded).
- You don’t need to do any update on the script.
Script
```powershell
# Get all users in the farm
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$Currentime = Get-Date -Format "yyyyMMdd_hhmmtt"
$filename = "FarmUsers"
$datafile = ("{0}{1}.csv" -f $filename, $Currentime)
$headerfile = "type,user,group,weburl,webtitle"
$headerfile | Out-File -FilePath $datafile

$iissitedata = Get-SPWebApplication
foreach ($farmsite in $iissitedata) {
    foreach ($SiteCollection in $farmsite.Sites) {
        Write-Host $SiteCollection -ForegroundColor Blue
        foreach ($web in $SiteCollection.AllWebs) {
            Write-Host "  " $web.Url $web.Name "users:" -ForegroundColor Yellow
            foreach ($usersite in $web.Users) {
                Write-Host "    " $usersite -ForegroundColor White
                $data = ("RootUser,{0},-,{1},{2}" -f $usersite, $web.Url, $web.Name)
                $data | Out-File -FilePath $datafile -Append
            }
            foreach ($group in $web.Groups) {
                Write-Host "  " $web.Url $group.Name ":" -ForegroundColor Green
                foreach ($user in $group.Users) {
                    Write-Host "    " $user -ForegroundColor White
                    $data = ("GroupUser,{0},{1},{2},{3}" -f $user, $group, $web.Url, $web.Name)
                    $data | Out-File -FilePath $datafile -Append
                }
            }
            $web.Dispose()
        }
    }
}
```
Once you initiate the script, it will run through all the webs in all site collections, collect the data, and save it as a .csv file in the same location you run it from.
Just run this script and you will get all the users on the farm.