Logging While Drilling Analysis: Well 15/9-F-14 (Equinor Volve)


In [6]: import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
plt.style.use('seaborn-darkgrid')
import seaborn as sns
from welly import Project, Well
import lasio

Logging While Drilling Analysis

Depth indexed data

    17 1/2" LWD Section
    12 1/4" LWD Section
    8 1/2" LWD Section
    Final DataFrame

Time indexed data

In [4]: ls *.LAS

 Volume in drive C has no label.
 Volume Serial Number is CEC9-4320

 Directory of C:\Users\Luis Navarro\Desktop\ProgrammingStuff\PetroleumData\Equinor Volve\Drilling\15_9-F-14

01/05/2021  03:34 PM         1,749,518 WL_RAW_BHPR-GR-MECH_MWD_1.LAS
01/05/2021  03:33 PM           744,995 WL_RAW_BHPR-GR-MECH_MWD_2.LAS
01/05/2021  03:34 PM         1,349,449 WL_RAW_BHPR-GR-MECH_MWD_3.LAS
01/05/2021  03:36 PM        11,000,465 WL_RAW_BHPR-GR-MECH_TIME_MWD_1.LAS
01/05/2021  03:35 PM         6,263,697 WL_RAW_BHPR-GR-MECH_TIME_MWD_2.LAS
01/05/2021  03:36 PM         9,642,380 WL_RAW_BHPR-GR-MECH_TIME_MWD_3.LAS
               6 File(s)     30,750,504 bytes
               0 Dir(s)   6,370,004,992 bytes free

Depth indexed data

In [122]: f1 = Project.from_las("WL_RAW_BHPR-GR-MECH_MWD_1.LAS")
f2 = Project.from_las("WL_RAW_BHPR-GR-MECH_MWD_2.LAS")
f3 = Project.from_las("WL_RAW_BHPR-GR-MECH_MWD_3.LAS")

0it [00:00, ?it/s]C:\ProgramData\Anaconda3\lib\site-packages\welly\well.py:192: FutureWarning: From v0.5 the default will be 'original', keeping whatever is used in the LAS file. If you want to force conversion to metres, change your code to use `index='m'`.
  warnings.warn(m, FutureWarning)
1it [00:01, 1.95s/it]
1it [00:00, 1.15it/s]
1it [00:01, 1.68s/it]
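
The FutureWarning above suggests forcing the depth unit with `index='m'`. A sketch of that call (assuming `Project.from_las` forwards the keyword to `Well.from_las` in this welly version, which the warning implies but is not shown here):

    # Following the warning's suggestion: force conversion of the depth index to metres
    f1 = Project.from_las("WL_RAW_BHPR-GR-MECH_MWD_1.LAS", index='m')
    f2 = Project.from_las("WL_RAW_BHPR-GR-MECH_MWD_2.LAS", index='m')
    f3 = Project.from_las("WL_RAW_BHPR-GR-MECH_MWD_3.LAS", index='m')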


In [123]: print('17 1/2" Section:') ; f1

17 1/2" Section:

Out[123]:
   Index  UWI  Data       Curves
   0           18 curves  ROP5, ARC_GR_UNC_RT, SWOB, TQA, RPM, STICK_RT, CRPM_RT, TFLO,
                          TRPM_RT, SPPA, SHKRSK_RT, SHK2_RT, SHK_ISONIC_RT, PDSHKRSK,
                          SHKPK_RT, ECD_ARC, APRS_ARC, ATMP

In [124]: print('12 1/4" Section:') ; f2

12 1/4" Section:

Out[124]:
   Index  UWI  Data       Curves
   0           18 curves  ROP5, ARC_GR_RT, SWOB, TQA, STICK_RT, SPPA, TRPM_RT, TFLO, RPM,
                          CRPM_RT, SHKRSK_RT, SHK2_RT, ECD_ARC, APRS_ARC, ATMP, SPM1,
                          SPM2, SPM3

In [125]: print('8 1/2" Section:') ; f3

8 1/2" Section:

Out[125]:
   Index  UWI  Data       Curves
   0           18 curves  ROP5, GRMA_ECO_RT, STICK_RT, SWOB, SHKRSK_RT, SHKPK_RT, RPM,
                          TRPM_RT, CRPM_RT, TFLO, DHAT, DHAP, TQA, ECD, SPPA, SPM1, SPM2,
                          SPM3
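
The mnemonics above are terse. A quick way to list each curve's unit and description (a sketch using lasio on the 17 1/2" file; the same loop works for the other two) is to read the ~Curve section of the LAS header:

    las = lasio.read("WL_RAW_BHPR-GR-MECH_MWD_1.LAS")
    for curve in las.curves:
        # mnemonic, unit and description come straight from the LAS ~Curve header block
        print(f"{curve.mnemonic:<15} {curve.unit:<10} {curve.descr}")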

DataFrame construction & Exploratory Data Analysis


sec_1 = 17 1/2" Section
sec_2 = 12 1/4" Section
sec_3 = 8 1/2" Section

In [316]: sec_1 = f1.df()
sec_2 = f2.df()
sec_3 = f3.df()

In [166]: print("Common columns between 17.5LWD and 12.25LWD:\t")
np.intersect1d(sec_1.columns, sec_2.columns)

Common columns between 17.5LWD and 12.25LWD:

Out[166]: array(['APRS_ARC', 'ATMP', 'CRPM_RT', 'Depth', 'ECD_ARC', 'ROP5', 'RPM',
       'SHK2_RT', 'SHKRSK_RT', 'SPPA', 'STICK_RT', 'SWOB', 'TFLO', 'TQA',
       'TRPM_RT'], dtype=object)

In [167]: print("Common columns between 12.25LWD and 8.5LWD:\t")
np.intersect1d(sec_2.columns, sec_3.columns)

Common columns between 12.25LWD and 8.5LWD:

Out[167]: array(['CRPM_RT', 'Depth', 'ROP5', 'RPM', 'SHKRSK_RT', 'SPM1', 'SPM2',
       'SPM3', 'SPPA', 'STICK_RT', 'SWOB', 'TFLO', 'TQA', 'TRPM_RT'],
      dtype=object)

17 1/2" LWD Section analysis

Turn the MultiIndex data structure into a RangeIndex structure


In [318]: sec_1.reset_index(inplace=True)
del sec_1["UWI"]

In [136]: print("Depths\n\tInitial :",sec_1["Depth"].min(),"[m]\n\tFinal:",sec_1["Depth"].max(),"[m]")
print("\nTotal depth =",sec_1["Depth"].max()-sec_1["Depth"].min(),"[m]")
print("\nColumns in DataFrame (logs) :")
for col in sec_1.columns:
    print("\t",col)

Depths
Initial : 1050.036 [m]
Final: 2281.5803999995414 [m]

Total depth = 1231.5443999995414 [m]

Columns in DataFrame (logs) :


Depth
ROP5
ARC_GR_UNC_RT
SWOB
TQA
RPM
STICK_RT
CRPM_RT
TFLO
TRPM_RT
SPPA
SHKRSK_RT
SHK2_RT
SHK_ISONIC_RT
PDSHKRSK
SHKPK_RT
ECD_ARC
APRS_ARC
ATMP


In [137]: print("Logs:\n")
sec_1_las = r"WL_RAW_BHPR-GR-MECH_MWD_1.LAS"
sec1_plt = lasio.read(sec_1_las)
fig,axes = plt.subplots(1,len(sec1_plt.keys()), figsize=(20,20))
for i,log in enumerate(sec1_plt.keys()):
    axes[i].plot(sec1_plt[log],sec1_plt['DEPT'])
    axes[i].invert_yaxis()
    axes[i].set_title(log,fontsize=8.5)

Logs:


In [89]: print("Missing data in the LWD data:")
plt.figure(figsize=(15,7))
sns.heatmap(sec_1.isnull(),cbar=False)
plt.show()

Missing data in the LWD data:

In [94]: total = sec_1.isna().sum().sort_values(ascending = True)
percent = round(((sec_1.isna().sum()/sec_1.isna().count())*100),2).sort_values(ascending = True)
missing_data = pd.concat([total, percent], axis = 1, keys = ["Total ","Percent"])
print("Missing data:")
missing_data

Missing data:

Out[94]:
                Total   Percent
Depth               0      0.00
SWOB                0      0.00
TQA                 0      0.00
RPM                 0      0.00
SPPA                0      0.00
TFLO                0      0.00
ROP5               14      0.17
ECD_ARC           255      3.16
APRS_ARC          255      3.16
ATMP              255      3.16
ARC_GR_UNC_RT    2714     33.58
STICK_RT         4855     60.07
SHK2_RT          4856     60.08
PDSHKRSK         4856     60.08
TRPM_RT          4858     60.11
CRPM_RT          4859     60.12
SHKRSK_RT        4860     60.13
SHKPK_RT         4860     60.13
SHK_ISONIC_RT    5056     62.56


12 1/4" LWD Section analysis


The LWD data columns from the 12 1/4" section match the data columns from the 8 1/2" section.
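
A quick way to verify this (a sketch re-using the sec_2 and sec_3 dataframes defined above) is to compare the two column sets directly:

    # True only if the two sections share exactly the same mnemonics
    print(set(sec_2.columns) == set(sec_3.columns))
    # mnemonics present in one section but not the other
    print(sorted(set(sec_2.columns) ^ set(sec_3.columns)))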

Turn the MultiIndex data structure into a RangeIndex structure

In [95]: sec_2

Out[95]:
                   ROP5  ARC_GR_RT     SWOB     TQA  STICK_RT      SPPA    TRPM_RT     TFLO  ...
UWI Depth
    2250.0336       NaN    91.8983   1.8952  7.6334      39.0  120.4109        NaN  2666.99  ...
    2250.1860       NaN        NaN   2.2988  8.5433       NaN  121.8338  1679.6875  2666.99  ...
    2250.3384   19.7809    99.6068   2.5225  8.2682      33.0  123.0028        NaN  2666.99  ...
    2250.4908   19.7804    91.8983   2.5204  9.2574       NaN  123.3786  1679.6875  2666.99  ...
    2250.6432   19.7728    91.8983   2.5110  9.3463       NaN  123.6700        NaN  2666.99  ...
    ...             ...        ...      ...     ...       ...       ...        ...      ...  ...
    2787.2436   30.0162        NaN  12.7218  9.3756       NaN  201.2531        NaN  3227.49  ...
    2787.3960       NaN        NaN      NaN     NaN       NaN       NaN        NaN      NaN  ...
    2787.5484       NaN        NaN      NaN     NaN       NaN       NaN        NaN      NaN  ...
    2787.7008       NaN        NaN      NaN     NaN       NaN       NaN        NaN      NaN  ...
    2787.8532       NaN        NaN      NaN     NaN       NaN       NaN        NaN      NaN  ...

3530 rows × 18 columns

In [319]: sec_2.reset_index(inplace=True)
del sec_2["UWI"]


In [139]: print("Depths\n\tInitial :",sec_2["Depth"].min(),"[m]\n\tFinal:",sec_2["Depth"].max(),"[m]")
print("\nTotal depth =",sec_2["Depth"].max()-sec_2["Depth"].min(),"[m]")
print("\nColumns in DataFrame (logs) :")
for col in sec_2.columns:
    print("\t",col)

Depths
Initial : 2250.0336 [m]
Final: 2787.8531999998 [m]

Total depth = 537.8195999997997 [m]

Columns in DataFrame (logs) :


Depth
ROP5
ARC_GR_RT
SWOB
TQA
STICK_RT
SPPA
TRPM_RT
TFLO
RPM
CRPM_RT
SHKRSK_RT
SHK2_RT
ECD_ARC
APRS_ARC
ATMP
SPM1
SPM2
SPM3


In [143]: print("Logs:\n")
sec_2_las = r"WL_RAW_BHPR-GR-MECH_MWD_2.LAS"
sec2_plt = lasio.read(sec_2_las)
fig,axes = plt.subplots(1,len(sec2_plt.keys()), figsize=(20,20))
for i,log in enumerate(sec2_plt.keys()):
    axes[i].plot(sec2_plt[log],sec2_plt['DEPT'])
    axes[i].invert_yaxis()
    axes[i].set_title(log,fontsize=8.5)

Logs:

In [104]: print("Missing data in the LWD data:")
plt.figure(figsize=(15,7))
sns.heatmap(sec_2.isnull(),cbar=False)
plt.show()

Missing data in the LWD data:


In [115]: total = 0 ; percent = 0
total = sec_2.isna().sum().sort_values(ascending = True)
percent = round(((sec_2.isna().sum()/sec_2.isna().count())*100),2).sort_values(ascending = True)
missing_data = pd.concat([total, percent], axis = 1, keys = ["Total ","Percent"])
print("Missing data:")
missing_data

Missing data:

Out[115]:
            Total   Percent
Depth           0      0.00
SPM1            4      0.11
SPM2            4      0.11
TFLO            4      0.11
SPPA            4      0.11
RPM             4      0.11
TQA             4      0.11
SWOB            4      0.11
SPM3            4      0.11
ROP5           10      0.28
ECD_ARC       181      5.13
APRS_ARC      181      5.13
ATMP          181      5.13
ARC_GR_RT    1999     56.63
SHK2_RT      2458     69.63
SHKRSK_RT    2543     72.04
STICK_RT     2602     73.71
CRPM_RT      2605     73.80
TRPM_RT      2617     74.14

8 1/2" LWD Section analysis


Turn the MultiIndex data structure into a RangeIndex structure

In [320]: sec_3.reset_index(inplace=True)
del sec_3["UWI"]


In [141]: print("Depths\n\tInitial :",sec_3["Depth"].min(),"[m]\n\tFinal:",sec_3["Depth"].max(),"[m]")
print("\nTotal depth =",sec_3["Depth"].max()-sec_3["Depth"].min(),"[m]")
print("\nColumns in DataFrame (logs) :")
for col in sec_3.columns:
    print("\t",col)

Depths
Initial : 2778.0996 [m]
Final: 3749.954399999638 [m]

Total depth = 971.8547999996381 [m]

Columns in DataFrame (logs) :


Depth
ROP5
GRMA_ECO_RT
STICK_RT
SWOB
SHKRSK_RT
SHKPK_RT
RPM
TRPM_RT
CRPM_RT
TFLO
DHAT
DHAP
TQA
ECD
SPPA
SPM1
SPM2
SPM3


In [144]: print("Logs:\n")
sec_3_las = r"WL_RAW_BHPR-GR-MECH_MWD_3.LAS"
sec3_plt = lasio.read(sec_3_las)
fig,axes = plt.subplots(1,len(sec3_plt.keys()), figsize=(20,20))
for i,log in enumerate(sec3_plt.keys()):
    axes[i].plot(sec3_plt[log],sec3_plt['DEPT'])
    axes[i].invert_yaxis()
    axes[i].set_title(log,fontsize=8.5)

Logs:

In [145]: print("Missing data in the LWD data:")
plt.figure(figsize=(15,7))
sns.heatmap(sec_3.isnull(),cbar=False)
plt.show()

Missing data in the LWD data:


In [146]: del missing_data ; total = 0 ; percent = 0
total = sec_3.isna().sum().sort_values(ascending = True)
percent = round(((sec_3.isna().sum()/sec_3.isna().count())*100),2).sort_values(ascending = True)
missing_data = pd.concat([total, percent], axis = 1, keys = ["Total ","Percent"])
print("Missing data:")
missing_data

Missing data:

Out[146]:
              Total   Percent
Depth             0      0.00
SPM1              0      0.00
SPPA              0      0.00
TQA               0      0.00
TFLO              0      0.00
SPM2              0      0.00
RPM               0      0.00
SPM3              0      0.00
SWOB              0      0.00
ROP5              4      0.06
DHAT             65      1.02
DHAP             65      1.02
ECD              65      1.02
GRMA_ECO_RT    2209     34.63
TRPM_RT        3998     62.68
SHKRSK_RT      3999     62.70
SHKPK_RT       3999     62.70
STICK_RT       4002     62.75
CRPM_RT        4007     62.83

Creating final DataFrame

Regarding the LWD tool in the BHA of each run:

ArcVision Tool : Sections 17 1/2" & 12 1/4"
EcoScope Tool : Section 8 1/2"

A statistical description should be made to check whether the Gamma Ray, Equivalent Circulating Density, and annulus temperature and pressure values can be concatenated across the two tools.
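
As a numerical complement to the boxplots below (a sketch; it only re-uses the section dataframes and curve names already defined), the summary statistics of the tool-specific Gamma Ray curves can be placed side by side before deciding to concatenate them:

    # Side-by-side summary statistics of the Gamma Ray curve from each section/tool
    gr_stats = pd.concat(
        [sec_1["ARC_GR_UNC_RT"].describe(),   # ArcVision GR, 17 1/2" section
         sec_2["ARC_GR_RT"].describe(),       # ArcVision GR, 12 1/4" section
         sec_3["GRMA_ECO_RT"].describe()],    # EcoScope GR, 8 1/2" section
        axis=1, keys=['17 1/2"', '12 1/4"', '8 1/2"'])
    print(gr_stats)

The same pattern applies to the annulus pressure, temperature and ECD curves compared in the following cells.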


In [220]: fig, axs = plt.subplots(1,3,figsize=(10,7),sharey = True)
fig.suptitle('Gamma Ray Values')
axs[0].boxplot(sec_1["ARC_GR_UNC_RT"].dropna())
axs[0].set_title("Section 17")
axs[1].boxplot(sec_2["ARC_GR_RT"].dropna())
axs[1].set_title("Section 12")
axs[2].boxplot(sec_3["GRMA_ECO_RT"].dropna())
axs[2].set_title("Section 8")
plt.show()


In [221]: fig, axs = plt.subplots(1,3,figsize=(10,7),sharey = True)
fig.suptitle('Annulus Pressures')
axs[0].boxplot(sec_1["APRS_ARC"].dropna())
axs[0].set_title("Section 17")
axs[1].boxplot(sec_2["APRS_ARC"].dropna())
axs[1].set_title("Section 12")
axs[2].boxplot(sec_3["DHAP"].dropna())
axs[2].set_title("Section 8")
plt.show()


In [222]: fig, axs = plt.subplots(1,3,figsize=(10,7),sharey = True)
fig.suptitle('Annulus Temperature')
axs[0].boxplot(sec_1["ATMP"].dropna())
axs[0].set_title("Section 17")
axs[1].boxplot(sec_2["ATMP"].dropna())
axs[1].set_title("Section 12")
axs[2].boxplot(sec_3["DHAT"].dropna())
axs[2].set_title("Section 8")
plt.show()


In [271]: fig, axs = plt.subplots(1,3,figsize=(10,7),sharey = True)
fig.suptitle('Equivalent Circulating Density')
axs[0].boxplot(sec_1["ECD_ARC"].dropna())
axs[0].set_title("Section 17")
axs[1].boxplot(sec_2["ECD_ARC"].dropna())
axs[1].set_title("Section 12")
axs[2].boxplot(sec_3["ECD"].dropna())
axs[2].set_title("Section 8")
plt.show()

The shock curves should also be analyzed before the final dataframe is built from the 17 1/2" through the 8 1/2" section. Boxplots are impractical here because the outliers dominate the plots, so pandas' built-in describe() is used instead.


In [237]: print(sec_1["SHK2_RT"].describe())
print(sec_2["SHK2_RT"].describe())
print(sec_3["SHKPK_RT"].describe())

count 3226.000000
mean 0.000310
std 0.017606
min 0.000000
25% 0.000000
50% 0.000000
75% 0.000000
max 1.000000
Name: SHK2_RT, dtype: float64
count 1072.000000
mean 0.067164
std 0.307340
min 0.000000
25% 0.000000
50% 0.000000
75% 0.000000
max 3.000000
Name: SHK2_RT, dtype: float64
count 2379.000000
mean 0.860866
std 13.007292
min 0.000000
25% 0.000000
50% 0.000000
75% 0.000000
max 504.000000
Name: SHKPK_RT, dtype: float64

The final DataFrame should contain the following columns:

Depth ROP5 GammaRay SWOB TQA RPM TRPM CRPM Stick SPPA TFLO AnTemp AnPres TransShock

covering depths from 1050 to 3749 [m]

In [321]: pre_1 = sec_1[['Depth','ROP5','ARC_GR_UNC_RT','SWOB','TQA','RPM','TRPM_RT','CRPM_RT','STICK_RT','SPPA','TFLO','ATMP','APRS_ARC','SHKRSK_RT','ECD_ARC']]
pre_2 = sec_2[['Depth','ROP5','ARC_GR_RT','SWOB','TQA','RPM','TRPM_RT','CRPM_RT','STICK_RT','SPPA','TFLO','ATMP','APRS_ARC','SHKRSK_RT','ECD_ARC']]
pre_3 = sec_3[['Depth','ROP5','GRMA_ECO_RT','SWOB','TQA','RPM','TRPM_RT','CRPM_RT','STICK_RT','SPPA','TFLO','DHAT','DHAP','SHKRSK_RT','ECD']]

In [322]: pre_1 = pre_1.rename(columns={'ARC_GR_UNC_RT':'GammaRay','ATMP':'AnTemp','APRS_ARC':'AnPres','ECD_ARC':'ECD'})
pre_2 = pre_2.rename(columns={'ARC_GR_RT':'GammaRay','ATMP':'AnTemp','APRS_ARC':'AnPres','ECD_ARC':'ECD'})
pre_3 = pre_3.rename(columns={'GRMA_ECO_RT':'GammaRay','DHAT':'AnTemp','DHAP':'AnPres'})

In [323]: print("Equal columns between preliminary dataframes:\n")
print(pre_1.columns == pre_2.columns)
print("\n",pre_1.columns == pre_3.columns)

Equal columns between preliminary dataframes:

[ True True True True True True True True True True True True
True True True]

[ True True True True True True True True True True True True
True True True]


In [324]: print("Pre_1\nDepths\n\tInitial :",pre_1["Depth"].min(),"[m]\n\tFinal:",pre_1["Depth"].max(),"[m]")
print("Pre_2\nDepths\n\tInitial :",pre_2["Depth"].min(),"[m]\n\tFinal:",pre_2["Depth"].max(),"[m]")
print("Pre_3\nDepths\n\tInitial :",pre_3["Depth"].min(),"[m]\n\tFinal:",pre_3["Depth"].max(),"[m]")

Pre_1
Depths
Initial : 1050.036 [m]
Final: 2281.5803999995414 [m]
Pre_2
Depths
Initial : 2250.0336 [m]
Final: 2787.8531999998 [m]
Pre_3
Depths
Initial : 2778.0996 [m]
Final: 3749.954399999638 [m]
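
The depth ranges show that each deeper section overlaps the bottom of the previous one. The next two cells remove the overlap by slicing hard-coded row offsets (207 and 65 rows); an equivalent, offset-free alternative (a sketch, not the notebook's method) is to mask on depth:

    # Keep only rows deeper than the bottom of the previous section
    pre_2_trim = pre_2[pre_2["Depth"] > pre_1["Depth"].max()].reset_index(drop=True)
    pre_3_trim = pre_3[pre_3["Depth"] > pre_2_trim["Depth"].max()].reset_index(drop=True)
    print(pre_2_trim["Depth"].min(), pre_3_trim["Depth"].min())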

In [327]: pre_2 = pre_2[207:]
pre_2 = pre_2.reset_index(drop=True)

In [339]: pre_3 = pre_3[65:]
pre_3 = pre_3.reset_index(drop=True)

In [344]: print("\tFinal depths: ")
print("Pre_1\nDepths\n\tInitial :",pre_1["Depth"].min(),"[m]\n\tFinal:",pre_1["Depth"].max(),"[m]")
print("Pre_2\nDepths\n\tInitial :",pre_2["Depth"].min(),"[m]\n\tFinal:",pre_2["Depth"].max(),"[m]")
print("Pre_3\nDepths\n\tInitial :",pre_3["Depth"].min(),"[m]\n\tFinal:",pre_3["Depth"].max(),"[m]")

Final depths:
Pre_1
Depths
Initial : 1050.036 [m]
Final: 2281.5803999995414 [m]
Pre_2
Depths
Initial : 2281.5803999999885 [m]
Final: 2787.8531999998 [m]
Pre_3
Depths
Initial : 2788.0055999999963 [m]
Final: 3749.954399999638 [m]

In [349]: frames = [pre_1,pre_2,pre_3]
LWD_F14 = pd.concat(frames)
LWD_F14 = LWD_F14.reset_index(drop=True)

In [351]: print("Total rows in preliminary dataframes : ",len(pre_1)+len(pre_2)+len(pre_3))

Total rows in preliminary dataframes : 17718


In [352]: LWD_F14.info()

<class 'pandas.core.frame.DataFrame'>
RangeIndex: 17718 entries, 0 to 17717
Data columns (total 15 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 Depth 17718 non-null float64
1 ROP5 17694 non-null float64
2 GammaRay 10886 non-null float64
3 SWOB 17714 non-null float64
4 TQA 17714 non-null float64
5 RPM 17714 non-null float64
6 TRPM_RT 6386 non-null float64
7 CRPM_RT 6390 non-null float64
8 STICK_RT 6404 non-null float64
9 SPPA 17714 non-null float64
10 TFLO 17714 non-null float64
11 AnTemp 17217 non-null float64
12 AnPres 17217 non-null float64
13 SHKRSK_RT 6457 non-null float64
14 ECD 17217 non-null float64
dtypes: float64(15)
memory usage: 2.0 MB
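
As a sanity check on the concatenation (a sketch that only re-uses the merged dataframe), the merged GammaRay curve can be plotted against depth to confirm there is no gap or duplicated interval at the section boundaries:

    # Plot the merged GammaRay log against depth as a continuity check
    fig, ax = plt.subplots(figsize=(4, 12))
    ax.plot(LWD_F14["GammaRay"], LWD_F14["Depth"], linewidth=0.5)
    ax.invert_yaxis()                      # depth increases downwards
    ax.set_xlabel("GammaRay")
    ax.set_ylabel("Depth [m]")
    plt.show()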

In [353]: LWD_F14.to_csv(r'C:\Users\Luis Navarro\Desktop\ProgrammingStuff\MachineLearning\Petroleum\Drilling\LWD_F14.csv', index = False)

Time indexed data

Welly library import

In [260]: import welly
print("Welly version: ",welly.__version__)

Welly version: 0.4.8


In [261]: t1 = welly.Well.from_las("WL_RAW_BHPR-GR-MECH_TIME_MWD_1.LAS", index="existing")

---------------------------------------------------------------------------
UFuncTypeError                            Traceback (most recent call last)
<ipython-input-261-11b34f59b6e2> in <module>
----> 1 t1 = welly.Well.from_las("WL_RAW_BHPR-GR-MECH_TIME_MWD_1.LAS", index="existing")

C:\ProgramData\Anaconda3\lib\site-packages\welly\well.py in from_las(cls, fname, remap, funcs, data, req, alias, encoding, printfname, index)
    334                            alias=alias,
    335                            fname=fname,
--> 336                            index=index)
    337
    338     def df(self, keys=None, basis=None, uwi=False, alias=None, rename_aliased=True):

C:\ProgramData\Anaconda3\lib\site-packages\welly\well.py in from_lasio(cls, l, remap, funcs, data, req, alias, fname, index)
    243         elif data and not req:
    244             curves = {c.mnemonic: Curve.from_lasio_curve(c, **curve_params)
--> 245                       for c in l.curves
    246                       if (c.mnemonic[:4] not in depth_curves)}
    247         elif (not data) and req:

C:\ProgramData\Anaconda3\lib\site-packages\welly\well.py in <dictcomp>(.0)
    244             curves = {c.mnemonic: Curve.from_lasio_curve(c, **curve_params)
    245                       for c in l.curves
--> 246                       if (c.mnemonic[:4] not in depth_curves)}
    247         elif (not data) and req:
    248             curves = {c.mnemonic: True

C:\ProgramData\Anaconda3\lib\site-packages\welly\curve.py in from_lasio_curve(cls, curve, depth, basis, start, stop, step, run, null, service_company, date, basis_units)
    209         # See if we have uneven sampling.
    210         if depth is not None:
--> 211             d = np.diff(depth)
    212             if not np.allclose(d - np.mean(d), np.zeros_like(d)):
    213                 # Sampling is uneven.

<__array_function__ internals> in diff(*args, **kwargs)

C:\ProgramData\Anaconda3\lib\site-packages\numpy\lib\function_base.py in diff(a, n, axis, prepend, append)
   1267     op = not_equal if a.dtype == np.bool_ else subtract
   1268     for _ in range(n):
-> 1269         a = op(a[slice1], a[slice2])
   1270
   1271     return a

UFuncTypeError: ufunc 'subtract' did not contain a loop with signature matching types (dtype('<U9'), dtype('<U9')) -> dtype('<U9')


In [267]: from welly import Well
w = Well.from_las('WL_RAW_BHPR-GR-MECH_TIME_MWD_1.LAS')

---------------------------------------------------------------------------
LASUnknownUnitError                       Traceback (most recent call last)
<ipython-input-267-5189a1a074e3> in <module>
      1 from welly import Well
----> 2 w = Well.from_las('WL_RAW_BHPR-GR-MECH_TIME_MWD_1.LAS')

C:\ProgramData\Anaconda3\lib\site-packages\welly\well.py in from_las(cls, fname, remap, funcs, data, req, alias, encoding, printfname, index)
    334                            alias=alias,
    335                            fname=fname,
--> 336                            index=index)
    337
    338     def df(self, keys=None, basis=None, uwi=False, alias=None, rename_aliased=True):

C:\ProgramData\Anaconda3\lib\site-packages\welly\well.py in from_lasio(cls, l, remap, funcs, data, req, alias, fname, index)
    209
    210         # Select the relevant index from the lasio object.
--> 211         l_index = getattr(l, index_attr)
    212
    213         # Build a dict of curves.

C:\ProgramData\Anaconda3\lib\site-packages\lasio\las.py in depth_m(self)
    809             return (self.index / 120) * 0.3048
    810         else:
--> 811             raise exceptions.LASUnknownUnitError("Unit of depth index not known")
    812
    813     @property

LASUnknownUnitError: Unit of depth index not known

Both attempts appear to fail because the time-indexed LAS files carry a date-time string index that welly tries to treat as a numeric depth. To analyze these logs an intermediate pre-processing step is needed; further analysis and research is required.
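
One possible workaround (a sketch; it assumes the time index of these files is stored as date-time strings, which the tracebacks above suggest, and it bypasses welly entirely) is to load the file with lasio and build the DataFrame directly:

    # Load the time-indexed LAS with lasio and parse the index ourselves
    las_t1 = lasio.read("WL_RAW_BHPR-GR-MECH_TIME_MWD_1.LAS")
    df_t1 = las_t1.df()                                         # first curve becomes the index
    df_t1.index = pd.to_datetime(df_t1.index, errors="coerce")  # assumes date-time strings
    df_t1 = df_t1[df_t1.index.notna()].sort_index()
    df_t1.info()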
In [ ]:
